Zing Forum


Xortran: A Classic Educational Project Implementing Neural Networks with 1960s Fortran

Xortran is a multi-layer perceptron neural network project written in Fortran IV, which can run on classic computers like the IBM 1130 and PDP-11. It helps learners understand the basic principles and historical evolution of neural networks by solving the XOR problem.

Fortran · Neural Networks · XOR Problem · Multi-Layer Perceptron · IBM 1130 · PDP-11 · Computer History · Machine Learning
Published 2026-05-03 09:13 · Recent activity 2026-05-03 10:28 · Estimated read: 7 min

Section 01

Introduction to the Xortran Project: Recreating Neural Network Fundamentals with 1960s Technology

Xortran is a multi-layer perceptron neural network project written in Fortran IV that runs on classic machines such as the IBM 1130 and PDP-11. By solving the XOR problem, it helps learners understand the basic principles and historical evolution of neural networks. This thread covers the project's background, technical implementation, hands-on experience, and the lessons it still offers today.


Section 02

Background: Historical Significance of the XOR Problem and the Fortran IV Era

Historical Status of the XOR Problem

XOR (exclusive OR) is a simple logical operation, yet a single-layer perceptron cannot compute it. This limitation, highlighted by Minsky and Papert in 1969, helped push neural network research into a "winter" lasting more than a decade. Only after the backpropagation algorithm was popularized in 1986 was it demonstrated in practice that multi-layer perceptrons can learn XOR, reigniting research enthusiasm.

Fortran IV and Classic Hardware

Fortran IV is a high-level programming language released by IBM in 1962 and widely used in scientific computing. The IBM 1130 (launched in 1965) and the PDP-11 (launched in 1970), the machines Xortran targets, were representative small computers of their eras: the IBM 1130 brought computing to scientific research institutions, while the PDP-11 was the machine on which Unix was rewritten in C and first flourished.


Section 03

Technical Implementation: Xortran's Multi-Layer Perceptron Architecture and Fortran IV Challenges

Multi-Layer Perceptron Architecture

Xortran implements a three-layer perceptron:

  • Input layer: Receives two binary inputs for XOR
  • Hidden layer: Learns non-linear features (the key to solving XOR, as XOR is non-linearly separable)
  • Output layer: Produces the XOR result
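To make the architecture concrete, here is a minimal Python sketch of a 2-2-1 threshold network computing XOR. The weights are hand-picked for clarity (one hidden unit acts as OR, the other as AND); they are illustrative, not Xortran's learned values:

```python
def step(z):
    """Threshold activation: 1 if z > 0, else 0."""
    return 1 if z > 0 else 0

def xor_mlp(x1, x2):
    # Hidden layer: two units acting as OR and AND detectors
    h_or  = step(x1 + x2 - 0.5)   # fires if at least one input is 1
    h_and = step(x1 + x2 - 1.5)   # fires only if both inputs are 1
    # Output layer: (OR and not AND) == XOR
    return step(h_or - h_and - 0.5)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor_mlp(a, b))
```

The hidden layer is what makes this possible: no single threshold unit can separate XOR's outputs with one line, but two hidden units carve the plane into the right regions.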

Fortran IV Implementation Challenges

  • Array handling: array sizes must be fixed at compile time (Fortran IV has no dynamic allocation), which limits network scale
  • Floating-point arithmetic: the code must tolerate the differing floating-point formats of the IBM 1130 and PDP-11
  • Input/output: batch-oriented operation, with punch-card or paper-tape input and line-printer output
  • Activation functions: no library support, so simple thresholds or piecewise linear approximations are coded by hand
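On the last point, a hedged Python sketch of a piecewise linear ("hard") sigmoid shows the kind of shortcut a machine without fast transcendental functions might use. The slope and clamping points below are illustrative choices, not Xortran's actual constants:

```python
import math

def sigmoid(z):
    """Exact logistic sigmoid (requires exp, costly on 1960s hardware)."""
    return 1.0 / (1.0 + math.exp(-z))

def sigmoid_pl(z):
    """Piecewise linear 'hard sigmoid': a line of slope 1/4 through
    (0, 0.5), clamped to [0, 1]. Needs only add, multiply, compare."""
    return min(1.0, max(0.0, 0.25 * z + 0.5))

for z in (-4.0, -1.0, 0.0, 1.0, 4.0):
    print(f"{z:+.1f}  exact={sigmoid(z):.3f}  approx={sigmoid_pl(z):.3f}")
```

The approximation stays within about 0.12 of the true sigmoid everywhere, which is often close enough for a small network to train.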

Section 04

Practical Experience: How to Run and Learn Xortran

Running on Simulators

Learners without physical vintage machines can experience it via simulators:

  • IBM 1130 Simulator (developed by Carl Claunch)
  • SIMH multi-platform simulation system (supports PDP-11)
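As a rough illustration, a minimal SIMH startup script for a PDP-11 session might look like the following. The CPU model and the disk image filename (rt11.dsk) are placeholders; you will need an actual RT-11 system image and should consult the SIMH documentation:

```
; pdp11.ini -- boot an RT-11 system disk under SIMH (illustrative)
set cpu 11/34          ; select a PDP-11 CPU model
attach rk0 rt11.dsk    ; attach an RK05 disk image (hypothetical filename)
boot rk0               ; boot from the attached disk
```

Launched with `pdp11 pdp11.ini`, this drops you at an RT-11 prompt from which the Fortran IV toolchain can compile and run the Xortran source.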

Code Reading

Reading the Fortran IV source code allows you to experience:

  • Period idioms such as fixed-form source (statements confined to columns 7–72), implicit typing (names beginning with I–N default to INTEGER, all others to REAL), and GOTO-based control flow
  • The evolutionary history of programming languages

Section 05

Learning Value: Understanding the Essence and History of Neural Networks from Xortran

Understanding the Essence of Neural Networks

Stripping away the abstractions of modern frameworks, the learner faces the core mechanisms directly:

  • Forward propagation: inputs flow through the network layer by layer to produce an output
  • Error backpropagation: the output error is passed back through the hidden layers
  • Weight update: each weight is adjusted in proportion to the error it contributed, which is how learning happens
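These three mechanisms can be sketched end to end in modern Python. This is a pedagogical stand-in, not Xortran's actual Fortran IV source: the sigmoid activation, four hidden units, learning rate, and random initialization are all assumptions made for the sketch:

```python
import math, random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

random.seed(0)
NH = 4                      # hidden units (assumption; Xortran's size may differ)
# w1[i] = [weight from x1, weight from x2, bias] for hidden unit i
w1 = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(NH)]
# w2 = hidden-to-output weights, plus output bias in the last slot
w2 = [random.uniform(-1, 1) for _ in range(NH + 1)]

data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
lr = 0.5

def forward(x):
    h = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w1]
    y = sigmoid(sum(w2[i] * h[i] for i in range(NH)) + w2[NH])
    return h, y

for epoch in range(10000):
    for x, t in data:
        h, y = forward(x)                       # 1. forward propagation
        dy = (y - t) * y * (1 - y)              # 2. output error (sigmoid derivative)
        dh = [dy * w2[i] * h[i] * (1 - h[i])    #    error backpropagated to hidden layer
              for i in range(NH)]
        for i in range(NH):                     # 3. weight updates
            w2[i] -= lr * dy * h[i]
        w2[NH] -= lr * dy
        for i in range(NH):
            w1[i][0] -= lr * dh[i] * x[0]
            w1[i][1] -= lr * dh[i] * x[1]
            w1[i][2] -= lr * dh[i]

for x, t in data:
    print(x, "->", round(forward(x)[1]))
```

After training, the rounded outputs reproduce the XOR truth table. Everything here fits comfortably in fixed-size arrays, which is exactly why the algorithm was implementable in Fortran IV.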

Historical Perspective and Technical Inheritance

  • Understand how scarce computing resources shaped algorithm design
  • The backpropagation algorithm remains the cornerstone of deep learning to this day, demonstrating technical continuity

Section 06

Modern Insights: Minimalism and Innovative Thinking from Xortran

Minimalist Wisdom

Complex systems are built on simple principles. Xortran's tiny network already contains the core ideas behind modern large models, and studying it helps keep those fundamentals in clear view.

Respect for History and Courage to Innovate

Criticism from Minsky and others once stalled the field, but researchers' persistent exploration eventually led to breakthroughs, and that spirit still inspires AI researchers today.

Interdisciplinary Value

Connects computer history, programming language studies, and machine learning, fostering comprehensive technical literacy.


Section 07

Conclusion: A Technical Bridge Connecting the Past and Present

Xortran is not just a retro project; it is a bridge connecting the past and present. It recreates history, pays tribute to pioneers, and provides learners with a unique way: understanding freedom through constraints, grasping the present through history, and gaining insight into complexity through simplicity.