Teaching kids at school about IT history and 8 bit computing

Feel free to talk about any other retro stuff here including Commodore, Sinclair, Atari, Amstrad, Apple... the list goes on!
svenvandevelde
Posts: 488
Joined: Wed Dec 23, 2020 6:30 am
Location: Belgium, Antwerpen

Teaching kids at school about IT history and 8 bit computing

Post by svenvandevelde »


I just opened my laptop and saw this prose. Amazing!

KICKC home page by Jesper Gravgaard.
My KICKC alpha with Commander X16 extensions.
neutrino
Posts: 182
Joined: Wed Oct 19, 2022 5:26 pm

Teaching kids at school about IT history and 8 bit computing

Post by neutrino »


https://yewtu.be/watch?v=EJh4BIujpHA - TechKnowledge Video: The complete history of the home microprocessor (2020-11-16)

  02:28 Real content starts.

From ENIAC to AMD Ryzen.

https://yewtu.be/watch?v=N3s0_yf2mS4 - The Computer Chronicles: Windows 98 (1998) (uploaded 2013-08-01)

  11:50 Linus Torvalds (the 1998 version) talks about Linux.

 

 

Edmond D
Posts: 479
Joined: Thu Aug 19, 2021 1:42 am

Teaching kids at school about IT history and 8 bit computing

Post by Edmond D »



On 12/10/2022 at 10:37 PM, svenvandevelde said:




My million-dollar question to the forum: I want to bring up interesting historical facts about computers. Do you know of any interesting stories, videos, or pictures about computers (even from the 50s and 60s) that would be good to show the kids, so the whole story comes alive a bit? I want to win their interest in the subject.



I'd suggest adding a section on visions of computers in entertainment. Look at Star Trek: the original communicators looked like a flip phone, and later spin-offs had tablet devices and talking computers. 2001: A Space Odyssey has a computer with a complex AI that understood spoken language much like Siri and Alexa, though HAL famously went off the rails. There are many more examples to use. A discussion could be built around what people thought computers could do, and the effect that had on the development of what actually came.

A final example would of course be the computers in The Hitchhiker's Guide to the Galaxy. Getting a meaningful, useful answer is as important as knowing what the question is really about. Plus, computers do make arithmetic mistakes (6 times 9; the Pentium FDIV division bug).





 

kaos
Posts: 23
Joined: Thu Nov 03, 2022 2:09 pm

Teaching kids at school about IT history and 8 bit computing

Post by kaos »


yewtube?

Is that 4chan slang for "jew tube"? Replace the prefix "yew" with "you" and avoid that potential trap.

 

As a kid, I loved in-depth descriptions of technology. So maybe more of that: how the CPU works. Actually go into detail.
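To show how little is needed to "go into detail", here is a toy fetch-decode-execute loop in Python. The three-opcode instruction set is invented purely for illustration (it is not real 6502), but the loop structure is exactly what any 8-bit CPU does:

```python
# Toy fetch-decode-execute loop for a made-up 8-bit CPU.
# The opcodes below are invented for illustration, not real 6502.

memory = [
    0x01, 5,    # LOAD #5  -> put 5 in the accumulator
    0x02, 3,    # ADD  #3  -> accumulator += 3
    0x03,       # HALT
]

acc = 0          # accumulator register
pc = 0           # program counter
running = True

while running:
    opcode = memory[pc]                        # fetch
    if opcode == 0x01:                         # decode + execute: LOAD immediate
        acc = memory[pc + 1]
        pc += 2
    elif opcode == 0x02:                       # ADD immediate
        acc = (acc + memory[pc + 1]) & 0xFF    # 8-bit wrap-around
        pc += 2
    elif opcode == 0x03:                       # HALT
        running = False
    else:
        raise ValueError(f"unknown opcode {opcode:#x}")

print(acc)  # 8
```

Kids can extend this with a SUB or JMP opcode in a minute, which gets the fetch-decode-execute idea across faster than any diagram.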

kaos
Posts: 23
Joined: Thu Nov 03, 2022 2:09 pm

Teaching kids at school about IT history and 8 bit computing

Post by kaos »


The Internet is "specified" in a bunch of Requests for Comments (RFCs).

RFC 1 - Two computers are connected together in a lab. Nice ASCII graphics.

The following few RFCs define the IMP protocol. It had an address field of 5 bits, so a maximum of 32 mainframes could be online, each with lots of terminals.

RFC 147 - The definition of a "socket", which everyone still uses and references from all kinds of official standards. Anyone who actually read this document would see it's just the random ramblings of some guy, completely incompatible with our usage of the concept.

RFC 791 - Definition of IPv4.

RFC 793 - Original TCP definition.

RFC 9293 - Current TCP definition.
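The TCP defined in those last two RFCs is what every modern socket API sits on. As a minimal sketch, here is a complete TCP round trip on loopback using Python's standard socket module (the modern BSD-style socket, not RFC 147's version; port 0 asks the OS for any free port):

```python
import socket
import threading

# Minimal TCP round trip on loopback: a listener thread accepts one
# connection and echoes back whatever it receives.

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))        # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]

def echo_once():
    conn, _ = server.accept()
    conn.sendall(conn.recv(1024))    # echo one message back
    conn.close()

threading.Thread(target=echo_once).start()

client = socket.create_connection(("127.0.0.1", port))
client.sendall(b"hello, ARPANET")
reply = client.recv(1024)            # small message: one segment on loopback
client.close()
server.close()

print(reply)  # b'hello, ARPANET'
```

Everything here — three-way handshake, sequencing, retransmission — happens invisibly underneath those few calls, which is a nice way to show how much RFC 9293 actually specifies.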

The tone in these documents has gone from an informal "request for comments" to hyper-strict formality. In the early days it's almost goofy. The length of the documents has also grown significantly, along with the complexity.

You could say something about the pre-Internet era (IMP -> NCP -> IPv4 -> IPv6). E-mail, Telnet, and FTP predate the modern Internet. RFC 801 describes how they intended to make the transition from NCP to IPv4/TCP. It went fast: the Flag Day.

And we have now been transitioning to IPv6... for the last 30 years... derp

neutrino
Posts: 182
Joined: Wed Oct 19, 2022 5:26 pm

Teaching kids at school about IT history and 8 bit computing

Post by neutrino »



On 12/12/2022 at 8:13 PM, kaos said:




yewtube?



Is that 4chan slang for "jew tube"? Replace the prefix "yew" with "you" and avoid that potential trap.



It's one of the Invidious mirrors with a short name, hence the usage. I switched to it when YouTube started all kinds of shenanigans and I got fed up with it. And the ownership of YouTube is what can be inferred.


On 12/12/2022 at 8:47 PM, kaos said:




The following few RFCs define the IMP protocol. It had an address field of 5 bits, so a maximum of 32 mainframes could be online, each with lots of terminals.



The IMP was built on a ruggedized DDP-516 (Honeywell 316), in essence a 2.5 MHz computer with 32 kwords of 16 bits. It had two types of interfaces: the 1822, which handshakes every bit, and classic RS-232 for the console etc. The 1822 interface could handle 180 kbit/s, but the IMP network ran at 50 kbit/s. The IMP connected to a modem with a coaxial link to the (AT&T?) nationwide network of frequency-carrier coaxial cables. Btw, here's how to program the 1822 interface..

So no ordinary modem. Nor local Ethernet.. ?

The "old" network was more like some computers with serial ports cross-connected via a modem.

I would say the NCP protocol at least is recognizable in a modern sense; the earlier stuff is really non-standard. Regarding e-mail, UUCP is worth mentioning, since many institutions had it as their sole Internet "connection", with a "super fast" 9600 or 19200 bit/s modem. Some still do. Keep in mind there were (and are) services that would e-mail files to you using uuencode and the like, so file access was accomplished that way.
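The uuencode trick mentioned above — wrapping arbitrary bytes in printable ASCII so they survive 7-bit mail transports — can still be demonstrated with Python's standard binascii module. A minimal sketch of one encoded line and its round trip (the payload is made up for illustration):

```python
import binascii

# uuencode turns arbitrary bytes into printable ASCII so they survive
# 7-bit mail transports; this is how binaries travelled over UUCP mail.
payload = b"HELLO.PRG contents"

# One uuencoded line: length character + encoded data + newline.
# b2a_uu accepts at most 45 input bytes per call (one line of output).
line = binascii.b2a_uu(payload)
print(line)

decoded = binascii.a2b_uu(line)     # exact round trip back to the bytes
```

A real uuencoded mail just repeats this line-by-line between a `begin 644 filename` header and an `end` trailer, 45 bytes at a time.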

As for IPv6: somehow it seems key actors have an interest in suppressing the protocol. They likely have a business and power interest in artificial IP scarcity. It will probably require some steamrolling to get it done.

 

Something to beware of in general is that the speed of innovation has been consistently increasing for at least 400 years, and the amount of current research is staggering compared to the beginnings. So now, before people can adjust to the current innovation, the next is out the door. One key question will likely be whether corporations shall control citizens, or citizens shall have self-determination.

 

kaos
Posts: 23
Joined: Thu Nov 03, 2022 2:09 pm

Teaching kids at school about IT history and 8 bit computing

Post by kaos »


On the future of computing:

Landauer's principle says that energy dissipates from irreversible transformations: erasing a single bit releases at least k_B·T·ln 2 of energy. This is why computers get hot (besides engineering imperfections).

Quantum computers are reversible and do not erase information (until you interact with the data, which causes a wavefunction collapse), and therefore avoid this problem.

We are currently running up against a heat wall caused by this process, some 25 years before Moore's law in combination with Landauer's principle would otherwise make faster computers completely impossible. (If I understand this correctly, that is when the signal-to-noise ratio would make it impossible to distinguish a 1 from a 0 due to the energy released, and the computer would balance at the physical edge of melting. So it's natural that in reality we hit the wall some years before the theoretical edge.)
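The Landauer limit is small enough to compute by hand. A quick sketch at room temperature — the erasure rate below (10^9 bits per cycle at 5 GHz) is an illustrative assumption, not a measured figure for any real chip:

```python
import math

# Landauer limit: erasing one bit dissipates at least k_B * T * ln 2 of energy.
k_B = 1.380649e-23        # Boltzmann constant, J/K (exact in SI since 2019)
T = 300.0                 # room temperature, K

e_bit = k_B * T * math.log(2)      # ~2.9e-21 J per erased bit

# Hypothetical chip erasing 1e9 bits per cycle at 5 GHz: minimum dissipation.
p_min = e_bit * 1e9 * 5e9          # watts
print(e_bit, p_min)                # ~2.87e-21 J, ~0.014 W
```

That floor is around 14 milliwatts, while real CPUs dissipate tens of watts — which shows how far above the Landauer limit (and how dominated by "imperfection") today's irreversible logic still is.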

In the current period CPUs are barely getting faster, so instead we build them larger, with more cores, stacked on top of each other, etc.

But eventually we will have to switch to reversible computing, and quantum computing, if we want better and faster computers.

It is possible to explain quantum computing so that a teenager understands it. Everyone just zones out whenever I have attempted it, though. I seriously think this is because people have taught themselves that it is difficult, and they give up without even trying.

So f**k it.

But see footnote A. That footnote explains about half of the intro. The other part is a simplified version of Born's rule.
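For what it's worth, the simplest statement of Born's rule really does fit in a few lines. A sketch in plain Python, with an illustrative pair of amplitudes for an equal superposition:

```python
# Born's rule, simplest case: a qubit in state a|0> + b|1> gives outcome 0
# with probability |a|^2 and outcome 1 with probability |b|^2.
a = complex(1 / 2 ** 0.5, 0)   # equal superposition (e.g. after a Hadamard gate)
b = complex(0, 1 / 2 ** 0.5)   # a relative phase changes nothing measurable here

p0 = abs(a) ** 2               # probability of measuring 0
p1 = abs(b) ** 2               # probability of measuring 1

print(p0, p1)                  # ~0.5 and ~0.5; they always sum to 1
```

That the probabilities must sum to 1 (so the state vector has length 1) is the whole normalization rule, and a teenager who knows Pythagoras already has the picture.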

neutrino
Posts: 182
Joined: Wed Oct 19, 2022 5:26 pm

Teaching kids at school about IT history and 8 bit computing

Post by neutrino »



On 12/12/2022 at 11:15 PM, kaos said:




It is possible to explain quantum computing so that a teenager understands it.



Where is this explanation to be found?


On 12/12/2022 at 11:15 PM, kaos said:




In the current period CPUs are barely getting faster, so instead we build them larger, with more cores, stacked on top of each other, etc.



A few years ago I read about 100 GHz transistor gates working in laboratories, so a 100 GHz CPU is possible if it weren't for the heat. But I think what ought to be tried is scrapping lots of transistors: current CPUs have billions of them, all making the heat situation bad, while some of the simpler ARM CPUs use something like 33,000. That is orders of magnitude fewer transistors (and thus less power), minus the ~50x faster clock and its correspondingly higher power. So: few transistors at extremely high speed.
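The arithmetic behind that trade-off is easy to check. A back-of-envelope sketch, using the rough rule that dynamic power scales with transistor count times clock frequency (voltage held fixed); all the numbers are illustrative assumptions, not datasheet values:

```python
# Back-of-envelope: dynamic power ~ transistor count x clock frequency
# (at fixed voltage). All figures below are illustrative assumptions.

big_cpu_transistors = 3e9        # a modern billion-transistor CPU
tiny_cpu_transistors = 33_000    # a minimal ARM-class core, as cited above
big_clock = 2e9                  # 2 GHz
tiny_clock = 100e9               # the hypothetical 100 GHz part

transistor_ratio = big_cpu_transistors / tiny_cpu_transistors   # ~90,000x fewer
clock_ratio = tiny_clock / big_clock                            # 50x faster

net_power_ratio = transistor_ratio / clock_ratio   # net switching-power win
print(round(transistor_ratio), clock_ratio, round(net_power_ratio))
```

Under these assumptions the tiny, fast core still comes out around three orders of magnitude ahead on switching power — which is the poster's point, even if the real win depends heavily on voltage and process.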

The 6502 team did the same: removed lots of parts to keep it simple, fit more dies on the same wafer, and get high yield, which equals lower price. But it required that management got out of the way.

kaos
Posts: 23
Joined: Thu Nov 03, 2022 2:09 pm

Teaching kids at school about IT history and 8 bit computing

Post by kaos »



On 12/12/2022 at 11:50 PM, neutrino said:




Where is this explanation to be found? ?



If you read Wikipedia about how it works, then you are reading my explanation.

TomXP411
Posts: 1761
Joined: Tue May 19, 2020 8:49 pm

Teaching kids at school about IT history and 8 bit computing

Post by TomXP411 »


Get back on topic, guys. Please take the quantum computing and RFC commentary somewhere else. Also, no Invidious links, please; use YouTube's approved domains only. Respecting copyright is one of the core rules here.

 

 

Post Reply