What you should know about computers

I love computation and the theory behind it. Leveraging our knowledge of math, logic, and eventually electricity has been the foundation of our incredible technological progress over the last 500 years or so. Most people don't understand what is really going on behind the scenes of their digital devices, or the complexity hidden behind the seemingly simple systems we use every day. With this piece, I want to help develop a basic understanding of our computational world and give people the tools and resources they need to take advantage of the power of information processing.

The Fundamentals

What is information and why is it important?

Information can be almost anything: written language, numbers, images, sound. Even things you wouldn't normally classify as information can be abstracted and represented by some kind of data set. Take a rock. At first glance, the rock is not information, but if you look closer and try to find the essence of what makes the rock a rock, you start to see that information is the only thing we have to measure and describe our interactions with it. The chemical bonds, the temperature, the volume and weight, the color: everything needed to perfectly duplicate the rock can be abstracted as numbers and words.
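To make the idea concrete, here is a minimal sketch of the rock abstracted into plain data. Every field name and value is an illustrative assumption, not a real measurement:

```python
# A hypothetical "rock" reduced to a data set. All values are made up
# for illustration; a real description would need far more detail.
rock = {
    "mass_kg": 2.3,
    "volume_cm3": 870.0,
    "temperature_c": 14.5,
    "color_rgb": (112, 104, 96),
    "composition": {"SiO2": 0.72, "Al2O3": 0.14, "FeO": 0.05},
}

# Once the rock is data, "interacting" with it becomes computation:
density_g_cm3 = rock["mass_kg"] * 1000 / rock["volume_cm3"]
```

The point is not the particular fields but that any physical property we can measure becomes numbers and words, and from there it can be processed like any other information.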

So why does this matter? Our interactions with the world and the dynamic nature of the universe can not only be described as information processing and transfer; they quite literally are nothing more than that. Every movement of your arm, every chemical reaction, thought, or volcanic eruption is a cascade of computational paths that, in combination, calculate the exact form the information takes at the next "step" in time. I put step in quotes because, as far as we know, the universe has no frame rate, so to speak. One thing to keep in mind is that these computational paths may be fundamental to how the information behaves and interacts, rather than an outside source deliberately using the information to achieve some result. Think of the electromagnetic force versus finding the average temperature in Australia: both are information processing, yet we perceive them as completely different.

These are not easy concepts to explain or even think through, but if you take anything from the previous two paragraphs, it is that information is the only thing that exists. We have conveniently classified different types of information and have become increasingly efficient at understanding the dynamics of the systems each type constructs.

What is computation?

Computation is information processing. It gives us our perception of the passage of time, and it governs everything from the interactions of the smallest particles to the largest cosmological events. Almost every computational system can be described by its input, the processing of that input, and its output (information -> calculation -> information).
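The input -> calculation -> output pattern can be sketched in a few lines. This is a deliberately trivial example (echoing the Australia-temperature idea from earlier); the function name and readings are assumptions for illustration:

```python
# input -> calculation -> output, in its simplest form.
def average(temperatures):
    # calculation: collapse many readings into one new piece of information
    return sum(temperatures) / len(temperatures)

readings = [31.2, 28.7, 33.1, 30.0]  # input: hypothetical temperature readings
result = average(readings)           # output: new information derived from the input
```

However simple, every computational system from a pocket calculator to a weather model follows this same shape: information goes in, a calculation transforms it, and new information comes out.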

Looking Beyond the Theory

I want to keep this relatively succinct, so, although it is fascinating, a deeper analysis of the full computational landscape of the universe is for another day. It may even be more of a philosophical question than one that can be solved using the branches of science.

Now that we have a better idea of what information and computation actually are, we can start to look at how we have designed our modern computational systems.

Data Representation and Operations

Data types and operations

Math

Logic

Data representation (number systems, data encoding)

Abstraction and interfaces

Basic computer architecture (logic gates, machine code, memory)

History of this too

Software controlling hardware (OS design, drivers, shell)

What is a modern computer (CPU, RAM, memory, human interface)

Shell (file structure, scripts, commands)

Basic program (file, environment, dependencies)

Pseudocode -> C -> C++ -> Python -> JS/HTML/CSS -> Functional/historical (Fortran, Haskell) -> Rust? -> Other specialization

Databases/blockchain? (Encryption, security, relevant to previous topics too)

Operating system design

Compilers

Complexity in hardware and software increasing

Future considerations