01 – Array Output: Building a Simple Autoencoder for MNIST Digit Generation

In this tutorial, we will explore the capabilities of EIR for array output tasks, specifically focusing on MNIST digit generation using a simple autoencoder. Arrays can represent various types of data, including images, time series, and more. This technique allows us to generate new, meaningful arrays based on patterns learned from the training data.

Note

This tutorial assumes you are familiar with the basics of EIR and have gone through the previous tutorials. This is not strictly required, but it is recommended.

A - Data

Here, we will be using the well-known MNIST dataset, provided as preprocessed NumPy arrays containing the handwritten digit images. To download the data, use this link.

After downloading the data, the folder structure should look like this:

eir_tutorials/d_array_output/01_array_mnist_generation
├── conf
│   ├── globals.yaml
│   ├── input_mnist_array.yaml
│   ├── input_mnist_array_with_label.yaml
│   ├── input_mnist_label.yaml
│   ├── output.yaml
│   └── output_with_label.yaml
└── data
    ├── __MACOSX
    ├── mnist_labels.csv
    └── mnist_npy
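
Before training, it can be useful to quickly sanity-check the downloaded data. Below is a minimal sketch that loads one of the arrays and peeks at the labels file; the array shape and the label column names are assumptions based on this dataset, so verify them against your own download:

from pathlib import Path

import numpy as np
import pandas as pd

data_root = Path("eir_tutorials/d_array_output/01_array_mnist_generation/data")

# Peek at the labels; we expect a CLASS column (it is referenced later in
# input_mnist_label.yaml).
labels = pd.read_csv(data_root / "mnist_labels.csv")
print(labels.head())

# Load an arbitrary array from mnist_npy and check its shape and dtype.
example_file = next((data_root / "mnist_npy").glob("*.npy"))
array = np.load(example_file)
print(example_file.name, array.shape, array.dtype)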

B - Training A Simple Autoencoder

Training an autoencoder for MNIST digit generation with EIR involves the familiar configuration files and follows a process similar to supervised learning. We’ll discuss the key configurations and visualize the training process, including the training curve and generated images at different iterations.

The global config provides standard parameters for training:

globals.yaml
output_folder: eir_tutorials/tutorial_runs/d_array_output/01_array_mnist_generation
checkpoint_interval: 1000
sample_interval: 1000
valid_size: 1000
batch_size: 64
n_epochs: 10
device: "cpu"
optimizer: adabelief
lr: 0.001
memory_dataset: true
latent_sampling:
  layers_to_sample:
    - "fusion_modules.computed.fusion_modules.fusion.1.0"

Note

One new thing you might notice here is the latent_sampling configuration in the global config, which lets you extract and visualize the latent space of chosen layers during training (computed on the validation set).

The input configuration specifies the structure of the MNIST array input:

input_mnist_array.yaml
input_info:
  input_source: "eir_tutorials/d_array_output/01_array_mnist_generation/data/mnist_npy"
  input_name: mnist
  input_type: array

input_type_info:
  normalization: channel
  adaptive_normalization_max_samples: 10000


model_config:
  model_type: lcl
  model_init_config:
    kernel_width: 8
    attention_inclusion_cutoff: 128

The output configuration defines the structure and settings for the generated images:

output.yaml
output_info:
  output_source: "eir_tutorials/d_array_output/01_array_mnist_generation/data/mnist_npy"
  output_name: mnist_output
  output_type: array

output_type_info:
  normalization: channel
  adaptive_normalization_max_samples: 10000

model_config:
  model_type: lcl
  model_init_config:
    channel_exp_base: 4

With the configurations in place, we can run the following command to start the training process:

eirtrain \
--global_configs eir_tutorials/d_array_output/01_array_mnist_generation/conf/globals.yaml \
--input_configs eir_tutorials/d_array_output/01_array_mnist_generation/conf/input_mnist_array.yaml \
--output_configs eir_tutorials/d_array_output/01_array_mnist_generation/conf/output.yaml

I got the following results:

[Image: training curve (loss)]

Since we included the latent sampling configuration in the global config, the latents are saved during training and a couple of visualizations are generated. Here is the t-SNE visualization of the latents at iteration 9000:

[Image: t-SNE visualization of the latents at iteration 9000]

Here the latent space is colored by digit label, so we can see which digits end up close to each other. For example, 4, 7, and 9 appear to cluster together.
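
If you want to reproduce such a plot yourself, a rough sketch using scikit-learn is shown below. The exact location and format of the saved latents inside the run folder are assumptions here (they may differ between EIR versions), so adjust latents_dir to wherever your run stored them:

from pathlib import Path

import matplotlib.pyplot as plt
import numpy as np
from sklearn.manifold import TSNE

# Hypothetical location of the saved validation-set latents for iteration 9000.
latents_dir = Path(
    "eir_tutorials/tutorial_runs/d_array_output/"
    "01_array_mnist_generation/latents/9000"
)

# Flatten each saved latent into a vector and stack them into one matrix.
files = sorted(latents_dir.rglob("*.npy"))
latents = np.stack([np.load(f).ravel() for f in files])

# Project the latents down to 2D for plotting.
embedded = TSNE(n_components=2, random_state=0).fit_transform(latents)

plt.scatter(embedded[:, 0], embedded[:, 1], s=5)
plt.title("t-SNE of sampled latents")
plt.show()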

When we are generating arrays, EIR saves some of the generated arrays (as well as the corresponding inputs) during training under the results/samples/<iteration> folders (the sampling is configurable via the sampling configuration in the output config). We can load these NumPy arrays and visualize them.
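
For example, a minimal sketch along these lines (the exact sub-folder layout under results/samples/<iteration> is an assumption, so adapt the glob to what your run actually produced):

from pathlib import Path

import matplotlib.pyplot as plt
import numpy as np

run_dir = Path("eir_tutorials/tutorial_runs/d_array_output/01_array_mnist_generation")

# Grab one generated array from the iteration-9000 samples and display it.
sample_dir = run_dir / "results" / "samples" / "9000"
generated_file = next(sample_dir.rglob("*.npy"))

array = np.load(generated_file).squeeze()
plt.imshow(array, cmap="gray")
plt.title(generated_file.name)
plt.show()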

Here is a comparison of generated images at iteration 500:

[Image: inputs vs. generated digits at iteration 500]

And at iteration 9000, we can observe the improvements in generation:

[Image: inputs vs. generated digits at iteration 9000]

C - Augmenting Our Autoencoder With More Data

In this section, we will explore how to augment our MNIST digit-generating autoencoder with additional data. Specifically, we will add the MNIST labels to the autoencoder, which will allow us to conditionally generate images of specific digits.

The global config remains the same as in the previous section:

globals.yaml
output_folder: eir_tutorials/tutorial_runs/d_array_output/01_array_mnist_generation
checkpoint_interval: 1000
sample_interval: 1000
valid_size: 1000
batch_size: 64
n_epochs: 10
device: "cpu"
optimizer: adabelief
lr: 0.001
memory_dataset: true
latent_sampling:
  layers_to_sample:
    - "fusion_modules.computed.fusion_modules.fusion.1.0"

The input configuration now includes an additional file for the digit labels, along with a slightly modified array input:

input_mnist_array_with_label.yaml
input_info:
  input_source: "eir_tutorials/d_array_output/01_array_mnist_generation/data/mnist_npy"
  input_name: mnist
  input_type: array

input_type_info:
  normalization: channel
  adaptive_normalization_max_samples: 10000
  modality_dropout_rate: 0.2

model_config:
  model_type: lcl
  model_init_config:
    kernel_width: 8
    attention_inclusion_cutoff: 128

Note

Here we see another new option, modality_dropout_rate, which randomly drops out input modalities during training. This can be useful for training models that can handle missing modalities.

input_mnist_label.yaml
input_info:
  input_source: "eir_tutorials/d_array_output/01_array_mnist_generation/data/mnist_labels.csv"
  input_name: mnist_label
  input_type: tabular

input_type_info:
  input_cat_columns:
    - "CLASS"

The output configuration has also been modified to accommodate the augmented data:

output_with_label.yaml
output_info:
  output_source: "eir_tutorials/d_array_output/01_array_mnist_generation/data/mnist_npy"
  output_name: mnist_output
  output_type: array

output_type_info:
  normalization: channel
  adaptive_normalization_max_samples: 10000

model_config:
  model_type: lcl
  model_init_config:
    channel_exp_base: 4

sampling_config:
  manual_inputs:
    - "mnist_label":
        "CLASS": "0"

    - "mnist_label":
        "CLASS": "0"

    - "mnist_label":
        "CLASS": "5"

    - "mnist_label":
        "CLASS": "5"

Note

Notice that here we are using manual inputs in the sampling configuration, which allows us to generate images of specific digits.

We can run the following command to start training the augmented autoencoder:

eirtrain \
--global_configs eir_tutorials/d_array_output/01_array_mnist_generation/conf/globals.yaml \
--input_configs eir_tutorials/d_array_output/01_array_mnist_generation/conf/input_mnist_array_with_label.yaml eir_tutorials/d_array_output/01_array_mnist_generation/conf/input_mnist_label.yaml \
--output_configs eir_tutorials/d_array_output/01_array_mnist_generation/conf/output_with_label.yaml \
--globals.output_folder=eir_tutorials/tutorial_runs/d_array_output/02_array_mnist_generation_with_labels

I got the following results:

[Image: training curve (loss)]

Here is a visualization of the latent space:

[Image: t-SNE visualization of the latents]

Here is a comparison of generated images at iteration 500 and 9000:

[Images: inputs vs. generated digits at iterations 500 and 9000]

Now, since we added those manual inputs earlier, they are also saved in the sample folders (under manual), and we can visualize them:

[Image: digits generated from the manual class-label inputs]

So indeed, in the absence of an actual image to encode, the model uses the class label to generate the respective digit. While not immediately obvious, generated images of the same class are not completely identical (although they are extremely similar), due to some stochasticity injected into the model.
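
If you want to check this yourself, the sketch below compares two of the manually generated arrays for the same class; the path to the manual folder is an assumption, so point it at the corresponding folder in your own run:

from pathlib import Path

import numpy as np

# Hypothetical location of the manual samples saved at iteration 9000.
manual_dir = Path(
    "eir_tutorials/tutorial_runs/d_array_output/"
    "02_array_mnist_generation_with_labels/results/samples/9000/manual"
)

# Compare the first two manual samples (both generated for CLASS "0" above).
first, second = sorted(manual_dir.rglob("*.npy"))[:2]
a = np.load(first).astype(np.float64)
b = np.load(second).astype(np.float64)

print("identical:", np.array_equal(a, b))
print("max abs difference:", float(np.abs(a - b).max()))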

D - Serving

In this final section, we demonstrate serving our trained model for MNIST array generation as a web service and interacting with it using HTTP requests.

Starting the Web Service

To serve the model, use the following command:

eirserve --model-path [MODEL_PATH]

Replace [MODEL_PATH] with the actual path to your trained model. This command initiates a web service that listens for incoming requests.

Here is an example of the command:

eirserve \
--model-path eir_tutorials/tutorial_runs/d_array_output/01_array_mnist_generation/saved_models/01_array_mnist_generation_model_9000_perf-average=0.9688.pt

Sending Requests

With the server running, we can now send requests with MNIST data arrays. The data arrays are encoded in base64 before sending.

Here’s an example Python function demonstrating this process:

import base64

import numpy as np
import requests

def encode_array_to_base64(file_path: str) -> str:
    # Load the stored .npy array and encode its raw bytes as base64 for the JSON payload.
    array_np = np.load(file_path)
    array_bytes = array_np.tobytes()
    return base64.b64encode(array_bytes).decode('utf-8')

def send_request(url: str, payload: dict):
    response = requests.post(url, json=payload)
    return response.json()

# The payload key ("mnist") must match the input_name used during training.
payload = {
    "mnist": encode_array_to_base64("path/to/mnist_array.npy")
}

response = send_request('http://localhost:8000/predict', payload)
print(response)

Retrieving Array Information

You can get information about the array type and shape by sending a GET request to the /info endpoint:

curl -X 'GET' \
  'http://localhost:8000/info' \
  -H 'accept: application/json'

This request will return details about the expected array input and output formats, such as type, shape, and data type.
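
The same information can also be fetched from Python, for example with a minimal sketch like the one below (the exact keys in the returned JSON, such as the shape and dtype fields, may differ between EIR versions, so print the full payload first and inspect it):

import requests

# Query the /info endpoint of the running eirserve instance.
info = requests.get("http://localhost:8000/info").json()
print(info)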

Decoding and Processing the Response

After receiving a response, you can decode the base64 encoded array, reshape it, and cast it to the appropriate dtype using the information obtained from the /info endpoint:

def decode_array_from_base64(encoded_array: str, shape: tuple):
    # The served model always returns float32 values; the shape comes from /info.
    array_bytes = base64.b64decode(encoded_array)
    return np.frombuffer(array_bytes, dtype=np.float32).reshape(shape)

# Depending on how the response is wrapped, the encoded array may also sit under a
# "result" key, as in the predictions.json example further below.
array_np = decode_array_from_base64(
    response['mnist_output'], shape=(28, 28)
)

Important

While the original output arrays can be of any dtype, and that information is provided in the /info endpoint, the response output arrays are always of dtype float32, which is the output dtype of the model itself. The model output is then un-normalized using the training set statistics (assuming normalization was used during training).

For example, since these are images originally in uint8 format, we can process the response arrays as follows:

from PIL import Image

# Min-max scale to [0, 1], then convert to 8-bit grayscale before building the image.
array_np = (array_np - array_np.min()) / (array_np.max() - array_np.min())
array_np = (array_np * 255).astype(np.uint8)

image = Image.fromarray(array_np)
image.show()

Analyzing Responses

After sending requests to the served model, the responses can be analyzed.

predictions.json
[
    {
        "request": {
            "mnist": "eir_tutorials/d_array_output/01_array_mnist_generation/data/mnist_npy/10001.npy"
        },
        "response": {
            "result": {
                "mnist_output": "AJATvgCEPj0A8EO8AADVugAYmjwAADS5AGDWuwAAcDsAqJK8AHCQPQBUIr0AkGy8AEBCvACA9roAgFM7AIDaOgBATLsAsMU8AGCMOwAAIDoAgGO7gMDivgBg5zsASJc8AADSuQCwoTwAjqs9AOjuPADg8L0ALAk9AHQjvQAANDkAmKs8AGCsPQCwLTwAHE29gGNzv0D4Fb9Anh8/ADPyvmAJAcCAqyI/AB44PiDpzz8AfVc/QNyeP0DXhj+Ahyg/gCrbPgDMgr0AQKk7AFB1vADAczsAwCk8AChnPQAgOz0AvBK+AMxYPQCcmT0AAIG6wKs/PwCygb0AkUk+QENjvwB5G7+Aoqu+8L7eP6DY6r9A/lbAoPOXP8Bo27/g3WvASOG1wBi2IsFAn/nAUA7VwHAlv8Cwok3AAFBNvQAoyD4AgFQ9AFTivQAcxD0AKAQ9AHcTvgCERz0AcO28AIiovABzYj6AMCa/wDUFP4DGhz6Qn27AAICNvzQw80CA05G/RM4xwSCBacBgUrm/EuQBQQCe5r34XJDAgFIZwABOgT2A9LzAwISVwIAjDT+Adre+AGjSPcAFXz8AvuI9AChnPQAuFL4ACP88AMjVPQCMqb0A9ZC+QHWdvwD/Er+ozk5AAJ11QKCrKkHMYiZBAMVgwAB8db+A5wbAQCRzQAD+370crrLBS2OUQtJBLkNOQDtD+mvHQYBzxj4A/Go+oHhEwACizb+4ugFAAAk+PgAqsD0Apxi+APAPPADTcj4ABe8+ANfgPxClB8BA8BrAsFIvQOAlnEAQ2BVAuHjNQCBynb+AOxo/cPwGwTxQPcFA92vA1j9fQl1cKkMlxJFD9vhoQ3pUZUJwR33AkK6sP2DhW8CAhcq/sAauPwCA4rsATCs9ACgbvgAMPD0AIsw9wOtnP8B/oj/gboi/UHVMwPCePcA4kjpAgjwCQUCmDMHg8nnAiNE/QIiRW8EMZdpAnEqmQglYPUMmbIFDPz9zQyeiFEP3nDBCRAm2QXCfcUBAVN0/AFDFvABtAL6AkaC+AIAQPQDFNr4AIpU9gKKnvvA9A0CAZoo/QLg3vwAa9T0AiCg9QNYBP4xOBUG0DxPB+EPtQHQcpkCqvNTBLEp7QjGlM0Ocbn9D/dWBQ3EII0MiNVpCUHidQmcICUPtM4RCACTKPWCB3T8gjui/gFWbvgD0ST0ANya+QDM5vwDs1L1oemlAEMTlP4AWNcDAWBs/gCc0v6QTAEH45QtAMAPhwIAzTj8ArpDBk63CQU/EOEPgoIVDDpBzQ4qaC0N4J9ZAcPJgQr5dU0MYFIRDRtUZQ9BOKsAArWU/ADtLPgBalr4AiNY9AEUOvoDjFL8AkA0/qH5CQABazT5A/FXAYGm6P4BdLMCwdgVA4IK7v3DsmcD0GgbB2EWtwF4Hu0LGJWpD/xB9QyZL2UJAhFG/ij34QSe5IENSmoFDz2qJQ4VGEEO8IpxAANBSPADLgT7Alg8/AF6cvgAHDr4AILU7AI0PPoAlsT7g+zTAIKjRvwCERj2QTwRB4McywKBd0MDQgUPAWrmBwcpamkLZwk1D676DQ+PcOUOotYVBs0QgQoFcJkNh14lDtMx6Q2TvL0MouWFCSIFlQAAsbb7giR3A4JObP4AUxT4AWSK+AFQSvQBmAr+AnTi/wDf0vwAA6j1Q0WfAiGP4QIAKZb9AEkG/5KoWwYwBFUEA5gNDtV96Q9oubUOTtgtDU1+dQipuMEMjWoZDd8uOQ2wEJ0NeJotC+Lq9wCD5rr8ATqq/4PKjv+CIgz+A0O8+AKUYvgCgaDwA+6K+AC7vvQC7ZD5gjqu/4KzOv9AzBUAszNdAUPMRwLwJWcHAQnlBMxERQ5NabENe7TFDYIIXQyWNakPINYtD2BVvQ9J0NkNUpA5CfjhTQThv3sCAEs2+4APMvwDeLj6Aig0/AI64PQAvE74AoPM8AJYwvoCuuT6AEAxAoDzrvwBdGj/goPy/PHu+QNCOnUByQ4XBIIX8QcyZQkM4zYVDDEloQyPCXkN8kYdD1VmBQ6JhJENxDWFCiKStwCByob9wwEBAwJ8IvwD82L0A3ZY/APCxPQCisD0AIyC+APRMPQA3Fr4A5EQ9WNYsQIAOdL8A5JW9+PNXQIAAzUC+mNJBxlS4wTC840EAxT9D6zqBQxeHgUNAqoVD5hJnQ3tgDUP8xj5CTCtNwVBaeEBA0z2/AIxXQAAihr6Au/U+OOUFQACTqz4AzEE9AMQUvgBUiz0A4HW+AMwvv4CIkD4wgcg/IM2cvwD6Jz9o25fA/FDFQOAMRcE/mINCsbpaQ9WXiEMSkYRDnUxrQzVSAkM6JeRBLFFswSr1tMHAUhXAAAqMP2D2AsCAVY4+4LKdPwgDFEAAaLI+AP68PQDtFL4AkA49gFMDv+Dczr8QGhXA4DE9QCCQWsBQopvApPc1wVaOQ0ECnrVCX9BOQ6HZhkOkjINDixFtQ/hQCEOAwmNATP4GwfKigkHAGJjAmKzUwNqpKEHQUak/AOwxP4hkBkBgmypAQEJVPwBEgD0ACya+AED2PIAFQr9AiDfA8FGWwHCPsD8AKMy+cMrpP5RANEIA8P5C3yxcQ+TtjkOwRm1DmhdqQ4wfhUOsuC9DUIQYwcRdy8F6pkJBkGxqQSBiz7/g+nVBGCYhQADHnr7wqrE/8N/PP4CavT4A+Go9ACYavgD48jyA5ce/ILTbv7iLjcAAmM4/AN/WvrRwQEEHRQhDYu9gQ+2xhkMGt2ND9L76QsgA3ULJHTdD8FhlQySNu0Jcr1PBnNcCwYjgz0CwG9rAQPCWPwAKST+AWIM+wJ6FP8D1rz8AANq5ADaYPQA4D74AITg+sOMSwAAWQr4Auge/IKTJP5BaZMCkxgdCOHxaQ1dIgkMxg1VDxRKDQgiwzMF0IGjB+qBmQoJvg0ORSzxDdz2MQeCJW8AQWZPAADVjvySZCsFQXQHAAJ6/PZB24T9ALlM/ABqcPQD+gD0AdRe+ABzFPcBn0L8A0CA/AC4bPgCaiz7AJqK/UCS6Ql6gg0NtxoFDfAUOQ2AUr78AcmvAZL5EQr3/GENvH5BDG0tUQ8plAEKAgpC+iMK2QAqCekHAugDAgHCOPoD2KkAoBkFAAAFuvwAJXz4AsNc9AIoivgC3Cz4ASAS9AJ0EvgDjFj+oaypAgF/APoA6yEIX4oZD1N2BQxQnQkOstjJDvjhXQ/WHaENyX4VDb2iHQzEvSUPohptBcFpmwMBZrUDo5QpAYB4VwMD7W0CITp5AoHvXP4Bh0r8A1Gs+ABCFPQAwKr4A8B+9YDiLPwBxsL5gaz3A0Pt3QAAFVD/TCgFCrRVQQ4m0g0NUFntDgqmDQ+5BgUPpyIFD6ex3Q9jsPUOgXtpCQFBuP0BeHz8A830+IMhKwHDGbMCAInA/HEerQAA6mD/ADVa/AMrRPQBcCT0AJBK+AOwMPYCeyT7ADhU/cCZZwNj2M0DQ4E1ASDXKwDTW0ELhbktDBIZfQ9ltcEOdj1ZD+Bg7Q4aQxkK+cfVBcJ
5HwIC/58BAK46/AK+uvuioPEDAsxo/gGYqP8iYMEAAB1Y/ADgOvYAz1r4AqDs9AJkQvgAQ/jwAqyc+QBsGPwCYFL8AlAc+YCcvQAAdoj84BdlAQCc3P0aKEEGgmg7BpIYaQdAOJ0Ak15hAeAzkwCz+g0CUkw9BuF3YwABEuD4Q3UJAsJkyQACgBz+AUGo/AOJTvgAQWz0AAQA+AGQaPQArFb4A0Ow8AGi4PADoGz0AexQ/wA80v8ANGT8AoLA/sJ0JQKCuVsCA+gjA4CI6wEAxV78AvQg/QKyePwTigEDYWdTAAIFwQHDPvz9AZRE/YNzHv6Cb9T8AmAI9AC/OPgC0MD0AsM48AHIGPgBsDD0Auxi+AIgTPQAIn7wAAGQ6AHBfvADvGr+AAoa+IO2NP4DZyz6A9VfAsC/aP8Bhh0Cg+/G/cP++wEBzWUCAc4NA8BGzwOB+ir+wQAVAADx9vgCXIj8AdfQ/ACCpuwBwljwAAAO6AMABPQCGiD0AEOM8AIgVvgAsHj0AoNu7AICuugBAODsAeIC8ALAovACMEz4ARWo+QJYnwACm7j2o4jVAYPjyvwCxpL7A72o/4MyqP4BB/D4AOMW+ALSgvQDmOD+A2w5AAOCSuwCYZL0AYJQ7AIA2OwC4xjwAVJY9AAj0PA=="
            }
        }
    },
    {
        "request": {
            "mnist": "eir_tutorials/d_array_output/01_array_mnist_generation/data/mnist_npy/50496.npy"
        },
        "response": {
            "result": {
                "mnist_output": "AH6AvQCIt7wANjm+AEAvOwBwyDwAADa6APAuvABAcjsAKI28AHjFPABQWL0AsLW8AAAOOwCAwDoAWK08AACROwCAQ7sACtO9AMDcOwAA4joAIIC7AKTcvgAAQTsAKKk8AGCzuwDNFz4AdrQ9ANYHPgAgkb0AKKS8APzwvQBALzsAQJs8AJSXPQCAQzsAHxm+0FQPQIANyEA0xaVACEiKQGhpiEAAFLE/IOzjPyA5/z8AanU/AGvKPgC6YL4gk4I/AKSTvQBw/DwA4Lk7AMBoOwBAYDsAPAc+AM6bPQCQED4A7IS9ADDTvAADFL4AoOA7AE0mPwD6gb2AhaC+IKi3PxAVR0GI+Y9BxtjnQTLrIEJmGTFCpI/tQcwbEEL75/pByfiVQYiNk8DIMZPAQBbQvwDwqLwAu52+kCyzP8AQJz8AdP09AIjWvABE1z0ADQc+ABiRvQDwFrwABje+AEgYPYCLqz6AAnS/wBUTP/zeMEH8xERClt+dQgy0+UK/QB1DXsMZQyjO1EIdAihDNCYOQ9pKiEK8IyxBQGMkv9Q0+0AAMfk/YH9XwKB53T8AtQo/AFE+v8BpPj8ALPA9ADwaPgCQW70ASIa8ACMLvgAE+70AVtu9gMonwCB+NcA920hCol0aQ9vqREPpTW5DrZVpQze7SUNENFhDhOleQ0CNf0OpCINDK9YsQ4IJDkIoI+tAeE++QMALTb+YsEdAgA0BP3DwRMBAFx4/AORzPQBRLT4ASGi9AAhxvQDUHb0AaJG+CIYWQKjqxMCMqd1ApHADQ2AhQENn0SxDZw8DQ/DUGUJkizlBoLBZwELxhUHO3/lCpVxuQ96VeUPi+/xCgPFdQDANpkCwahVAIH8XQMBkjr8A1pu9wHtMP4D+yb4AvxA+AJqfvQDAY7wACRO+QH8Xv7CNH0CQ09rAglcKQvz/JENQAAFDy8kdQs7NrcG8CIHBwGbRwBDDUcBI4MDAzDonQZlXVEKJVERD4qZfQ+jlUkIQmaFAdOqVQCwaikAAMWc+sK1WQADHnT5AJvS/APR7vQD4Vb0AmIU8QM8UvyD+hr8APdm+0NmswJDYGkLSMMZCA9FsQuosPkEo65DBZujAQSBsyT/IGM7AVBauQYBkBz+06EBB4ClEQ/79hUNSsulCIGP+P8Bl1L9ANbo/gGC8PoDhbL/AbS+/gI7ivwC6sz0AjJW9AFD5vYAmg74AhJc/gKVhv4Drqb625jpBkATOQXBb4EHMU8dAGH1oQADAwrwA138+mHviwAA4qUGYqOLAMGSAwXzwHUPySGZDDIEKQxAgI0FAx1vBAHixvEBwesAgkrw/QK8HPzAmCsAAKO08AGxkvQCpF75A+R8/sArcP6Bfnz+wqR9AAFO1vuDWmr8QetnACBL8QAjZJ0FoLpnAcNgLwMS/LMFIxjlB/ksIQdAy7MDrYyxDrSJ7Q2iQy0JAeWDAgEhXwOA8k0CAngw/kBCgwIBU0L6Ac/W/AM5vvgDcbL1AsAE/0FHOPwDkST9wvNo/gJtlPwD5cD6AC3a/8CSjwMDq8b+Ig7DAWK3KwHZ/IkHgr9+/+K4sQHC4XcE+TZ1BB5NQQ1neakOgvYdCLlUhQQDUOb3AAtS/gPG+vgAPPj4AGD4/AOolvoAMuD4AKpW9AGBovCA9hD8AkrY9AH4SvhAZBsAgnxXAdKEewUrTF0FYNXtAGLr0wOC2nD/A7bq/UBYPwE4BOUHwiu7AgP3AQkXkckOo77lC8GgAQQC2Cj8MFwzBQCIHP8Cf87/Qc+VAoIvnP4Du7j4Aybo+AAh5vQCiDL4AoAQ/gH5iPwCwTz2IJNXAgK5dwBz/CcH42o3AWPWUQBAasMCwnr5AgMOQP0hdEEEy8MLBKVvTQe4iaEMG8WlDyxE6Qtg3+sBIbQRA6HOVwKAOikCYrURA4Nnsv2D4wz8AOT0+AAgQPgDgib0A1LG9ADsVvoDvnb5wW+Y/YApZwACRYT8IgQZA0DP/P/DSs8CgixPAmBGswfCQkMC4k1VAGLZDwfK3vEI9WndDyygWQ84DC0GwGXtBBmVjQaykjEDQ2dZAmG0BQFwmUsFgzqg/ACEUvgAY2T0AYG29AOxZvQCC/b2A56G/QInZP6A3xD/gXoA/Liw0QXypuUD8wQ3BSE63wFgFwsB8gqhAODPDwHOfFkJ7RjRDBkt4QyuVrUKIEZHALuU+Qegq5cDAwuq/AJRUP8AZfb98H3XBwIjSPwCzcz4ATOo9AJh1vQCymL2A6Za+QIr0v8CIKL/oWTVAgKxtv3CYNsBAtSfAKqA4QbYPhkK4aAFDRL0PQwBGCkNCuztD+7doQwzmGEN6TNFB4OCJv1xKdMEepafBIAOIwECTyr8A0EW9UGfDP9CeHUAAxaA+ACoaPgAair0AVJy9ADfKvkBKIcAACFS/wEh5v4BztL5soLbBCjYAQpAB/EJPu2pD+M14QyjHg0MvZ2xDPZeKQw6YYUOUjydC8CBJwI04h0H2qgZBiM+bQIMfhEEsAkVBkL7WQHqB00FEPZFAQNVZPwCofj0AQpG9ALaEvUDtAL/Q/BPAILTHvyC+v8C4LC7BbC2UQaT4DUPeF1lDC+8KQ0Uiu0LObfdCERglQ9Dhh0OSiXpDepC0QggyVMH0kmLBALwjPfipMME8u0LBIGCjwCiwncC0cEtBkN1tQADEFT4AoA0+AAKNvQDoR70AZxu+4B+Hv8ArA8AAxsa/4sOhwcw28kKy5FhD8hISQyxxLEGQOF1BuuKSQjGgN0MLz29D4YSEQ3XCKUOHbMZB9No5wUNuRsKG9ajCiJuVwhBPnMJlUADCuR4awoDkzj4ANCi+AO0ePgCodr0AABw+ACCoPgDAfD8AWme/AFOVPiYHAkKfe05D+cR0Q8z4VEJIkcHB8HFQQh42GUP2pF5Dik79QurRXUOmqU5DlQ42QxiH9EKUaTtBAFBnPxOphEEWh7xBrCgqQsdIKEI0v+tAAKDNuwBAMT4ARHu9AACSPcCZLD8wxFFAgJGlvwC1+b8bJCVC5hd2Q3r8fkP+9/tCbf0IQ0TyN0PxoGRDT8Q0QwDo2bycVUVCr4Y5Q/PqikNq6nBD2q44Q8IIWUPjF2ND8HpVQ4SaKUPzlSdDpcSbQQCWrD0AIUM+AOKfvQDnCD6Aupg+OFEkQGDKkD/YfptAmIT8wPo0K0PBFWRDvktdQ+FkgEM5c2JD+7gVQysry0H6Q97BI7+GQV3EOEJsyepCSVkiQwcBIEM/IRpDiuoaQ9AQwUIlRxtDRtT0Qr52lUEAcCM+AKUsPgBilr0Aqu29AKoNPgCqRb4A4EW/hBPyQAjL8cC6FPBBfqTGQjr/xEK//41C7pAOQpjqzMDAQxrBuOUIQSCIykAyVoTCQGXnwESO60EvwoBBBIcTwXHUsUFiV5jBNNw0QjMoBELkcPBAABjhPAB+Aj4A/oC9ALjuvIBp575gPOS/wL+IvwBet73wMLY//NxHwfCgJ0Dg1J6/YMP1P76DhsHA7I3BwMDIP0ZAksEwNmhAAC
hQP/wFvkCwDrXAav+7wZnOA8LoNbzASBwcwTD9KsGoKp5AgEhhP0DXBL8Adgg+ALyEvQCA1rwA1OK9QP4+v0C5VD+w5gXAkH6oP6BZjb8geelAMDa2wKhpfEAAKYI+WOKWQCChSMAYMKTASAf+wOBWkEDQs0PA4DGHvwDGkz7oHpJAYPXfQAQ5iEBY3hzBAEduPgB2kj4ASt49ALETPgDwc70AgK68AHjhvQCQtzwAOoc/gEWtvhh0I0DA68zAWxWEQYDfr79UQgLBXmcKQcA72z/wTd7AgDMTwDitcMFkqD7BGNX9wHDkGMAgjNM/7NMOQYCrwUCgWZZAsLogwOBqh78ATAA+AFDSPQApHT4A9H29ALi3vAAVHL4AAII7AKAMvIDf5r4gl5U/YOjavwhgkEDgM1bAiHklQExci8FcvyvBFA/PQKOihUGcAeNA8JcDwNCsVsCgJpzAAESyvmj7p0Bg80FAgIZdPwBw1T0AQDQ7AG8UPgAAqz0AmxE+AJxovQAcAr0Arg6+AACsOgCAmzoAMAG8AK77PYARvD4A9ZA+4Bf9v4A9pr4A/Le+MF4JwAC5qT5A2eY/2DdWQBA13D8ACtw9QBl/v4CjkD7gBfg/AExFPoAauD4AABC4ALBHPQDvDj4AQpM9AGcPPg=="
            }
        }
    },
    {
        "request": {
            "mnist": "eir_tutorials/d_array_output/01_array_mnist_generation/data/mnist_npy/25640.npy"
        },
        "response": {
            "result": {
                "mnist_output": "ADKzvQBYbT0A+U8+AIDUOgDgjTwAAI+6AAAbOgCARLsA8NW8AKTDPQAQTbwAULu8AIAWuwDAersAKOI8AABXOgCAMbsAvKK9AEBjPABghjsAABY6ADX2vgB0GT0AADQ7AABwuQD8ib0ARhC+AACXugAoqr0AeGM9AHALPgAAI7oAEKG8AC6XPQAkLb0AZpg9AOV4vwBAejsADAi+QJnyv5DJrj/wANA/AEhSPQBZ+T6AkU4/YMvPP8Aotj9A7Ck/gO9OPwDQLb4AoHK8ALBgPAAADjoA4HK9ACYAvgBACzwA6Ju9ABCCPQCeHT4AwHq7ANONPgCYmrywLMg/IO+yQJS8wkCAvGc/QK5eP/BMVMBU74NAWNjoQHCkVMBgFDVBqFXhwGieAkAi/ChBwHaOPyBK1r8gfaS/AJ1SvjDp6z8Adsi9gKCgvgAoob0AwCW8AACdvQBWhD0A8Fw+AJMuPgCgeT3AaUG/oOWBP9bbLUGA7htAABILPzD0zT/E9VPBINkuQCA4k78AgUi+aLwGQKCUgz+g1a0/AMOpvkAVOj/gzSzAIJdOwABbhj8g4pg/gM+CPoAIDD8AKwO+AOAUPAAYkb0AbFk9gCSbPgCQ3r3AhiS/AMC7v9DTJMAYyoNBpACVQBo+QkHAhrdA4I6WQFDGO8BgbPjAUFIWwQC60L9keYtASEvvwMzgL8GcL8JAGFGCwMDisb8Q+xZAKJoqQDBtyz8AtHk+gASrvgDIHz0AJqO9AIDTOwBKyj5Afw8/CCsuQJgNusBIMOHA2E1LQThjmcAgifI/cOOpwLhOk8CIyuDAcjhPQfBeysDEk5tAQlMuQdQmMsHu31JBQHRfP76GCkGAJLe+APnpvmg7S0BAAo8/gO3BPkCHPL8AkH48AOKyvQBARj0ASeA+AOUlP7BRIUDwS3jAIA+Lv0CWtcCAlug+QPBJQAhnykAWT6xBNCKpQnMzREMlCINDiYRoQ5SeCUN8DbBCeqzBQVhNs8CMGsJANEmSQMCR6r9gMyZAMMcpwCBVkD9AzBW/AIAGvgD+rb0AYKM9AG0xvgCwHj0APjo/ALMbPlVfgEEbji5C4qWIQixf9ULhpDpD+eRXQwwaaUMZ33lDlUWEQ+hPhEPRDmxDBc9vQ9ghOkPW801C4LiRP6ApjL8A3b6/wD8Qv5BwR8AYeBlAQJkQPwBGib0AXPu9ALyQvgDuAb8AsjC+gFa1PpALZsAQGXpCfog0Q420TkOrvXVDPwGAQ+0Ja0MCUIND349jQ3m7XUPSulRDuNN9Q+KCiUOQ7nJDoDMiQ1iV/0FwUtPB0KBHwFCzmMCY0jFAEJY+QAASAD8AEB48AH7dvUDlBb/gf5s/wNkbP7j/m8CUfCZBs+AWQwP7kUM1uXtDVORsQ6WmZkMBm19DQYo7Q7a73ELCkHZC5qGpQitFGUPuzzFDq591Q9R0gkNkGNlCyMzUQOD5McAANVE/oNiYP6CJOkDABoY/AIzpvgCUub0AJHm+mHkNQEhSAkDgXjfBWmWSQtXGf0NZ/4hDei4pQ/Y+0EI2MuZCjG6vQgaiTELYvbjAhFZMwXjWJEDAL9JATp6UQtNkV0MCF5hDNspMQ7CCREJAqjXAUO25P5DMrMBQXrE/AEJ2PgBVlD4AVNW9AHcWvnCSJUAAIK68OAYTQL4IDUPL7YhD3BpBQ8FrhUIYwwhBENWfQah3JsHQHmPBUPhvwKB1qD9xgJ9BqHRLwSRMnUEaDjtDH42LQyFpZ0OkL4FC4AeGvyC1xb8IAZjAAFmAPwAQTr0AGuY9ALTJvQCgPzzIpQJAwJ2fvwmmt0HYtRZDClxnQ0SeCUPiN2NB9l3AwSRnHcFAVvi/8h0BQYSd/kDgiYw/oDYewHSBHcF4fgdCqmcdQzJjbkO76UdD604fQvjOikDAv0XAgHzXv0Dzlz+Ahdy+ACDeOwCgmL0A+Bs90Pa4P2A4979chMVBFAT4QtDeWkMXER9DnMRRQiiD+8DAPEk/vPw+Qd4IHkEoOlhAAHznvaQTMEHMJGvB/PKLQlo7UkMeAHpDQGZDQ8g6SUEwigJAoGJFwAAIXD5QRL0/wJVXvwDAubwAiNC9AIjePEDnIT+gzua/SCv1QMqntUK+OH5D5QBvQzDu/kIsa/hAHoC6wfC1TMAgfjnAAE9NvvoGqUEorejAQBQiwYD1aUKBYFxDDWyGQxBIZEMbzxBC0FV6wAA6GT/A8ts/kLgGQACQDj0A0A68AP6lvQDI+DzA+wA/4COsv8xKB8GHYm9Ca3VpQ+J8lkOF8mNDAhDEQih8uEAMDiLBQNNBwMAeM0AkZz5BPExjwQR2hcEqD4xCUjNaQw5UhUOjoXpDAe2bQnApncDwh8FAAF/PPzAPDkAACkO+ABQMPQDamL0AgAQ7AHDIPmDnxr+EHQXBuB3lQdzD00KvC3lDmGGXQ261bUNh3wtDmsmJQqzvCcHY947AyMbrQEinrUCen1hCNNAXQwKSeEPyI3pDq82CQ8QUw0IAwWS+yOqSQMCQBEBwV+U/gCGiPgAQU7wAOKi9AOjJPACd0T5AouO/AG8MPnaNHkHglow/ZL37Qiv7ZkOx0IRD2AR+Q0jSMkNw4fFCmsacQtb710JgWxVD2tQ9Q95WXUMrNHNDfap8Q4XLakPqnZZCAK9jPwBY9bwA+rW9AJosP4Dghr4AuJk8ABTuvQAEWD1A1Uq/IHorwBA63T+w21lAoHoswURNf0HI5NBCzCoxQ9ZoikPp/ZFDMjt5Q7xBZUOV9lVDoTOGQ6XdiEMY4YJDxrBvQ0f6bEOcKy5DtDecQahyJUAAsQc+oEaCvwDd+T6AgPq+AMDDPACYlb0AoAc+MIodwODb+b8geDDAAEjpPkQn5kAkKEbBPdDqQShEG0L8TzJDuthrQ2trZEPUI1pDCGghQ2utT0MmhFhD6DZuQy2eiUOC0G5D34UNQ3Ad7cCwJdI/gLplv7BCRcDAFFY/AO1LvgAo0TwA9KG9ABB6PPC6GMAA5ME+KO/FwIAatb4wZxrAsDfXP4KbnUGg2DrBDkSoQSXagkKE/q9CbK6BQs2Ju0EtvgdCsJe0QnsDKkM5lJBDS6yAQ+U4F0ObNbJBoJD7v8CUhr8AkKy/gPLqvgD2N74AqGQ9ADS4vQC2ij3AtZa/qA8pQGACz794JDRAgJHOvsCF1r80rJ5BFFiCQEQaDcFs0wbBwKndwPY3HEGUPyDBNE0vwTDemEFIkBhDXsmNQ5Ftg0OQxflCJgj/QcDBaz8AYoW9MKb4P0BEpL8AXkS+ANibPADk0L0A8pm9ACiKPUgLAkBADq4/OIZIQABbGD5gqb6/wFtbwKBu/kDgtv2/0HciwLSBA8F0LWtBlIuPQGqJg8FQZNNBfqgQQ6XeiENByIZDxtnwQnaR+UEI31lAoNK9P8DKxT8A8xK/AOyUvgDAQ7wAbpa9AAAsPQCC4L2AcrA+gJFDv/Dz/T+AQAu/oAbDP3zyG0GINMPAiC63wEB6hUDgtOTAepYGQaSYC8Fgqo8/UP
rfwEqFtkJau2pDtdCRQ2lbNUPcgOdBwBOoP4C1ZD9A0Vc/ALqYvYAjbr8AgN+6AMKXvQDoKT0Aebc+AEdtvkBMYb9AwHm/QJxJwGCLgD8AfD4+eKqzwETPhEBI2QNBCNqoQOB8ocFEPaNAhFq+QELnucGc85tCnhQ/Q8ATmEPUJ0JDDmipQQCgSD4AwVo+AN1JPgAM5T0AdO69AHALvAAap70APB49AJBnPgCIhzwAaRS+IJGnvyhUhsCIRARA6KkqQACoqT5QCcBAwNcHP8CEkEDCY6NBql1vQfTrJsG53wnC+a4KQqCp6UIlWFtDxg7gQmyrKkEA1aI+AAnHPgArJD4Aqpa9AE0JvgAgvrsAAKa9ALBKPQDNMj4AgDw7AI7cvUBiIb/ADQzA0DEGQECECUBQ7gLABMamQPCbKMCAwcrAdvgLQRxKE8GsaBnBThCEwdIjAEHJTARC7BGIQgu3E0L0rJdAQCwFPwBw6jwAAKC6AOhXvQCw3b0AoOq7AGalvQB8Mj0AWTI+AAATOgDgg7sAqKg8AORbPQBSnj3ABHE/IJXSvwD/dT6Ad9o/gJCRv+C4rz8QDA3AQJKuPwB0pT5AtFs/gHkgQPQuiEAw4nRAADBbvAAsL70AAE48AICTOgDoX70A/QS+AACZug=="
            }
        }
    }
]

For example, using the approach described above, we can visualize the generated images from the responses:

[Images: generated digits decoded from the three responses above]
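
To reproduce these yourself, the decoding steps above can be wrapped into a small loop over predictions.json. The nesting of the encoded array under response -> result -> mnist_output follows the file shown above, and the (28, 28) float32 layout follows the /info details discussed earlier:

import base64
import json

import numpy as np
from PIL import Image

with open("predictions.json") as f:
    predictions = json.load(f)

for i, entry in enumerate(predictions):
    encoded = entry["response"]["result"]["mnist_output"]

    # Decode to float32, reshape to the image layout, then min-max scale to uint8.
    array = np.frombuffer(base64.b64decode(encoded), dtype=np.float32).reshape(28, 28)
    array = (array - array.min()) / (array.max() - array.min())

    Image.fromarray((array * 255).astype(np.uint8)).save(f"mnist_output_{i}.png")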

If you made it this far, thank you for reading! I hope this tutorial was interesting and useful to you!