MNIST Digits with NeuralCode

See how Egalosai's AI understands handwritten digits using the classic MNIST dataset.


About MNIST

The MNIST dataset is a collection of 70,000 grayscale 28×28-pixel images of handwritten digits (60,000 for training, 10,000 for testing), widely used as a benchmark in machine learning research.

Why MNIST?

It serves as an ideal teaching tool for understanding basic neural network concepts like image classification, convolutional layers, and backpropagation.

Input Size

28×28 pixels (784 features)
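The 784-feature count is just the flattened image: 28 × 28 = 784. A quick NumPy sketch (using a synthetic image, purely to illustrate the shape math and the 0–1 normalization used later in the example):

```python
import numpy as np

# A single synthetic 28x28 grayscale image with pixel values 0-255
image = np.random.randint(0, 256, size=(28, 28), dtype=np.uint8)

# Normalize to [0, 1] and flatten to a 784-dimensional feature vector
features = (image / 255.0).reshape(-1)

print(features.shape)  # (784,)
```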

Classes

Digits 0 through 9
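Because there are ten classes, the network's final layer emits one probability per digit. Softmax is the standard function that turns raw class scores into that probability distribution; a minimal NumPy sketch (the scores below are made up for illustration):

```python
import numpy as np

def softmax(logits):
    # Subtract the max for numerical stability, then normalize
    exps = np.exp(logits - np.max(logits))
    return exps / exps.sum()

# Hypothetical raw scores for digits 0-9
scores = np.array([1.2, 0.3, 4.0, 0.1, 0.5, 0.2, 0.0, 2.1, 0.4, 0.3])
probs = softmax(scores)

print(probs.argmax())         # predicted digit: 2
print(round(probs.sum(), 6))  # probabilities sum to 1.0
```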

NeuralCode Implementation


from egalosai import NeuralNetwork, InputLayer, Conv2D, MaxPooling2D, Flatten, Dense, OutputLayer
import mnist
import numpy as np

# Load MNIST dataset
(X_train, y_train), (X_test, y_test) = mnist.load_data()

# Normalize pixel values to [0, 1]
X_train = X_train / 255.0
X_test = X_test / 255.0

# One-hot encode labels for categorical cross-entropy
y_train = np.eye(10)[y_train]
y_test = np.eye(10)[y_test]

# Create neural network for digit classification
model = NeuralNetwork()
model.add(InputLayer(input_shape=(28, 28)))
model.add(Conv2D(filters=32, kernel_size=3, activation="relu"))
model.add(MaxPooling2D(pool_size=2, strides=2))
model.add(Flatten())
model.add(Dense(units=128, activation="relu"))
model.add(OutputLayer(units=10, activation="softmax"))  # one output per digit class

# Configure the model
model.compile(optimizer="adam",
              loss_function="categorical_crossentropy",
              metrics=["accuracy"])

# Train the model with NeuralCode acceleration
history = model.train(X_train, y_train,
                      epochs=10,
                      validation_data=(X_test, y_test),
                      use_quantum_inference=True)

# Evaluate model performance
test_loss, test_accuracy = model.evaluate(X_test, y_test)
print(f"\nTest Accuracy: {test_accuracy:.4f}")
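Assuming common convolution conventions (3×3 kernel with no padding, non-overlapping 2×2 pooling — an assumption about NeuralCode's defaults, not something the example states), the intermediate tensor sizes in the architecture above work out as follows:

```python
# Output-shape arithmetic for the network above, assuming "valid"
# (no-padding) convolution and 2x2 pooling with stride 2
input_size = 28
kernel_size = 3
pool_size = 2
filters = 32

conv_size = input_size - kernel_size + 1         # 28 - 3 + 1 = 26
pooled_size = conv_size // pool_size             # 26 // 2 = 13
flattened = pooled_size * pooled_size * filters  # 13 * 13 * 32 = 5408

print(conv_size, pooled_size, flattened)  # 26 13 5408
```

So the Flatten layer hands a 5,408-dimensional vector to the 128-unit Dense layer before the 10-way softmax output.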

This example uses Egalosai's quantum-optimized deep learning framework for MNIST classification.


Performance Dashboard

Training Accuracy

99.25%

Test Accuracy

98.73%

Inference Time

423 ms

Ready to Build?

Try this example in your Egalosai workspace or request a custom architecture.