concept for a 21st century BASIC?

Feel free to talk about any other retro stuff here including Commodore, Sinclair, Atari, Amstrad, Apple... the list goes on!
voidstar
Posts: 451
Joined: Thu Apr 15, 2021 8:05 am

concept for a 21st century BASIC?

Post by voidstar »


I may be laughed out of town for this, but I'm going to throw it up in the air anyway. I can't run very fast, so please don't tar and feather me.

Attached are notes on how I'd teach young people about programming. We can use the vintage processors, but I think BASIC needs a rethink. Understanding *how* ROM BASIC works is fairly insightful, but I think it's where younger folks quickly lose interest. A vintage reboot could maybe use an alternative approach to how you program the thing. Yes, there is certainly a very decent C compiler for the 6502. But in a way, I consider the approach proposed here a sort of "in the middle" between assembler and C, because I believe it could export INTENTS into either assembler or any high-level language. So you get the "feel" for the purity of assembler (at least in the sense of "your code uses resources, be mindful of that"), but you can also run your INTENT on your Windows box or your phone (for real, NOT in an emulator), or on an Arduino.


Just tossing out ideas,

v*


concept of Intention Oriented programming using virtual constructs.pptx

SlithyMatt
Posts: 913
Joined: Tue Apr 28, 2020 2:45 am

concept for a 21st century BASIC?

Post by SlithyMatt »


I'll have to read through that, but have you considered making a video of you presenting this material? If I've learned anything, it's that there's a substantial audience for slide presentations on tech concepts on YouTube. 

Also, I humbly submit my own XCI game engine (https://github.com/SlithyMatt/x16-xci) as an assembly-like programming language that also lets people make games without programming experience.

rje
Posts: 1263
Joined: Mon Apr 27, 2020 10:00 pm
Location: Dallas Area

concept for a 21st century BASIC?

Post by rje »


I'm not laughing, I'm always interested in talking about value-added retro-programming.

Now your slides, sir.  With all due respect... Let me say that they are rich with detail, and I think a lot of it is useful, but perhaps not in a slide-deck sense.  So: you do need to break them down and split material out.  I recommend a triage, where you remove everything that doesn't matter.  Then, reduce the remaining concepts into tight little bullets.

Then, support your slide deck with a document that re-expands your ideas.

Scott Robison
Posts: 952
Joined: Fri Mar 19, 2021 9:06 pm

concept for a 21st century BASIC?

Post by Scott Robison »


I would also suggest sharing it in a way other than PPTX ... not everyone has or even wants to use Microsoft Office / PowerPoint.

BruceMcF
Posts: 1336
Joined: Fri Jul 03, 2020 4:27 am

concept for a 21st century BASIC?

Post by BruceMcF »



59 minutes ago, Scott Robison said:

I would also suggest sharing it in a way other than PPTX ... not everyone has or even wants to use Microsoft Office / PowerPoint.

At a minimum, check how it displays in the FOSS "Impress", part of the LibreOffice productivity suite. If there are problems, save it as a PPT and see if that displays better. They are working on the quality of Impress's rendering of PowerPoint slides, but there is always the risk of a bad display on the free tools if you haven't checked it on those systems.

Scott Robison
Posts: 952
Joined: Fri Mar 19, 2021 9:06 pm

concept for a 21st century BASIC?

Post by Scott Robison »



3 minutes ago, BruceMcF said:

At a minimum, check how it displays in the FOSS "Impress", part of the LibreOffice productivity suite. If there are problems, save it as a PPT and see if that displays better. They are working on the quality of Impress's rendering of PowerPoint slides, but there is always the risk of a bad display on the free tools if you haven't checked it on those systems.

One step further would be to export as PDF ... this will allow one to keep the formatting in what is arguably a more open format.

voidstar
Posts: 451
Joined: Thu Apr 15, 2021 8:05 am

concept for a 21st century BASIC?

Post by voidstar »


Thanks for going easy on me! Some folks can get pretty defensive about their favorite tools or methodology, at all extremes: from "if it needs to be done in assembly, use assembly, duh!" to "Java runs on 300 billion devices, nothing wrong with it." I haven't really run across a "BASIC is King" person, but I suspect one is out there.

Apologies for the very much "draft" appearance - it was typed up from hand notes I made years ago, rather than scanning them.

Brooks' Mythical Man-Month and "No Silver Bullet" forever stick in my mind. Programming is hard, no two ways about it. It's just... back in the '60s, they got excited and celebrated shaving a single opcode from a function, and I wish we could still celebrate things like that. At 3-5 GHz we take so much for granted. And I don't literally mean one opcode these days -- I mean removing whole statements from a high-level language.

For example, like on page 7 of my presentation: in a more expanded version, I'd show an optimization I normally do where no stack space is needed. Instead of declaring redundant local copies of what I'm about to put into the database, push an empty record and then populate the database entries directly. Obviously there are limits to where that can work (if the database needs a key, or if the data has a lot of conditions that need to be checked for validation). But if it's just a dumb vector, go for it (a sketch is below).
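A minimal C++ sketch of that in-place trick, assuming the "database" is just a std::vector of plain records (the Record fields here are invented for illustration):

```cpp
#include <string>
#include <vector>

struct Record {
    int         id;
    double      value;
    std::string label;
};

std::vector<Record> db;

// Copying style: build a temporary local record, then copy it in.
void add_copy(int id, double value, const std::string& label) {
    Record r;            // redundant local copy on the stack
    r.id    = id;
    r.value = value;
    r.label = label;
    db.push_back(r);     // copies the whole record into the vector
}

// In-place style: grow the vector by one empty record, then fill its
// fields directly -- no redundant local copy.
void add_in_place(int id, double value, const std::string& label) {
    Record& r = db.emplace_back();  // default-constructs in place (C++17)
    r.id    = id;
    r.value = value;
    r.label = label;
}
```

The temporary in add_copy costs stack space plus a full copy into the vector; emplace_back() constructs the record in its final location, so each field is written exactly once.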


It's just that, many times, I've witnessed that "feels good" moment when you refactor some dead weight out of code and get a 100x improvement -- and that nugget was sitting there the whole time; with just a little more insight into the code, folks could have seen it clearly (something the "profile for the heavy performance hitters" approach didn't detect). But I work at the extreme where my bottleneck is the performance of the L3 cache, and I know most folks don't go there -- there are engineering situations where critical performance is essential, but understandably not in the more casual kinds of programming.

Example: a kid was calling get_clock() all over the place. I think it was an age-out condition -- i.e., if this table element is older than some amount of time, change its color to indicate "old data". So he's doing get_clock() for each element of this 100,000+ entry table, multiple times per element (a compound condition, not just the single time field). And this table needed to be refreshed pronto -- real-time, important data -- and presently it was too slow. What to do? So I said: get_clock is a system call -- probably blocking, and certainly not atomic, since this clock spans several words. Why not call it once at the top of your age-out thread, and PASS the result (by reference) down along to your functions? The database is locked anyway; the data's times aren't updating until after you're done. Better yet, make it a member variable and don't even bother passing it around.

So -- he said he had thought of that, but figured the optimizer would have figured it all out. Thing is, technically he's right: I could see an analysis deducing that those are all the same call within the same thread. But this was years ago, and the compiler had no pragmas or other indicators that this was all one thread (and in any case, at least then, it didn't have the smarts to deduce it). Alternatively, if get_clock() had been altered to just do something static like "return 5;", technically he'd be right also: the optimizer might deduce that all those calls return the exact same thing. But no, the system had an actual clock. Usually you profile, look for your heavy hitters, and go after those first -- but in this case you wouldn't notice these "cuts by a thousand calls" eating away at the overall performance. Anyway, the age-out feature was rescued; a before/after sketch is below.
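A minimal sketch of that hoisting, with hypothetical names (get_clock(), Entry, and the fields are stand-ins for illustration, not the original code):

```cpp
#include <cstdint>
#include <vector>

struct Entry {
    std::uint64_t last_update;  // when this element was last refreshed
    bool          stale = false;
};

// Stand-in for the real system call: expensive, possibly blocking,
// and not atomic (the real clock spanned several words).
std::uint64_t get_clock();

// Before: a get_clock() per element -- 100,000+ system calls per sweep,
// invisible to a profiler as any single heavy hitter.
void age_out_slow(std::vector<Entry>& table, std::uint64_t max_age) {
    for (Entry& e : table)
        if (get_clock() - e.last_update > max_age)
            e.stale = true;
}

// After: one snapshot of "now" per sweep, passed down. The table is
// locked for the whole sweep, so a single snapshot is correct.
void age_out_fast(std::vector<Entry>& table, std::uint64_t max_age) {
    const std::uint64_t now = get_clock();
    for (Entry& e : table)
        if (now - e.last_update > max_age)
            e.stale = true;
}
```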

The approach here isn't really trying to hide the high-level language. It's two things: while coding, have two side monitors that are more explicit about what your code costs. One shows the corresponding assembly language -- it doesn't even have to be precise, just a relative approximation that says "you're doing 30x more instructions now than 10 versions ago; can you afford the slowdown? If not, maybe look into how to do it better." The other shows "what resources are being consumed, and who's being blocked right now?" As much as I love simple vintage computers, the modern world (since about 2005) is multi-core [maybe not in ultra-low-power applications], so organically playing nice with other programs running on the local CPU is just an aspect many developers have to deal with. Load balancing is something we all have to deal with at integration -- I appreciate programs that run correctly, but we also have to keep an eye on their efficiency. A Sloppy Jalopy program that uses 50 MB/s across the bus to spin a dumb globe graphic in circles isn't as fun to watch once you see that, since it'll impact my transcoding and image stacking, etc.


voidstar
Posts: 451
Joined: Thu Apr 15, 2021 8:05 am

concept for a 21st century BASIC?

Post by voidstar »


PDF version.  Very much a rough draft.

I'll try to get it more organized, might not be till Fall though.  Travel plans coming up.

concept of Intention Oriented programming using virtual constructs.pdf

Scott Robison
Posts: 952
Joined: Fri Mar 19, 2021 9:06 pm

concept for a 21st century BASIC?

Post by Scott Robison »


Thanks for taking the time to put it up in PDF.

I've long contemplated visual programming but have never come up with anything I think is "good enough". My wife works at a middle school that has a "creative coding" class (I think it's called) that is just a very simple introduction using a system they call "code blocks". It gives the students a list of the JavaScript primitives and lets them drag and drop them into a "function" pane, reorder them with the mouse, and edit the parameters a specific block might take (like loop or if conditions).

I like the idea of decoupling the code from a "rigid" text format; it allows the students to get a feel for things before expecting them to get the syntax correct and so on.

Based on my quick perusal of the PDF, I think the problem is the amount of effort experts have to put into the tool to create something usable by a novice, plus people potentially being put off by terminology like "intent oriented programming" or some such.

I've been thinking through a "modern BASIC" for the X16. My hope is to come up with something that could be in the ROM and available at powerup without having to load it from a file. Something that would "abstract" the address space of the X16 into a more linear looking thing rather than banked RAM / ROM + conventional memory. Something that could support cooperative multithreading, multiple "processes", fully relocatable tokenized code, and "virtualized" access to the hardware (particularly the display). Something that has better integrated debugging capabilities.
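To illustrate just that address-flattening piece, here is a minimal C++ sketch, assuming (as on the X16) an 8 KB banked-RAM window at $A000 with a one-byte bank-select register; the names and the exact register location are my assumptions, not a spec:

```cpp
#include <cstdint>

// Assumed memory map (X16-style): banked RAM appears through an 8 KB
// window at $A000-$BFFF, selected by a bank number register.
constexpr std::uint16_t BANK_WINDOW_BASE = 0xA000;
constexpr std::uint32_t BANK_SIZE        = 8 * 1024;

struct BankedAddress {
    std::uint8_t  bank;      // value to write to the RAM bank register
    std::uint16_t cpu_addr;  // address the 65C02 actually dereferences
};

// Flatten: treat all of banked RAM as one linear array of bytes.
BankedAddress to_banked(std::uint32_t linear) {
    BankedAddress b;
    b.bank     = static_cast<std::uint8_t>(linear / BANK_SIZE);
    b.cpu_addr = static_cast<std::uint16_t>(BANK_WINDOW_BASE + linear % BANK_SIZE);
    return b;
}

// e.g. linear address 20000 -> bank 2, cpu_addr $A000 + $E20 = $AE20
```

A ROM interpreter would do this with shifts and masks, but the idea is the same: tokenized code and variables live at linear addresses, and the runtime worries about the bank register, so a program never notices the $A000 window.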

I don't know what will eventually come of it, as it is something I am restricted to working on in my spare time, but it would provide more functionality than BASIC v2 and provide an environment that could support multiple programs running at once. No, it won't be a speed demon, but it's intended for those who do not want to have to do everything in assembly.

As for the question posed about commercial BASIC software ... yeah, a lot. https://en.wikipedia.org/wiki/QuickBASIC was used in commercial software development. A former employer, Clark Development Company, released multiple versions of PCBoard BBS software developed using QuickBASIC before eventually migrating to C and assembly language somewhere around version 14 or 14.5 (or maybe 14.5a ... it was over 25 years ago now and my brain isn't as nimble as it once was).

https://softwareengineering.stackexchange.com/questions/149457/was-classical-basic-ever-used-for-commercial-software-development-and-if-so-ho answers the same question on a broader basis.

Now, a big part of the problem with answering this question is what qualifies as "BASIC"? Do you mean strictly line numbered BASIC running a slow interpreter? If so, then less software (though not zero). I think the development language is independent of the eventual delivered program, though. QuickBASIC allowed compiling to DOS EXE so that one didn't have to own the purely interpreted environment. Also, as we've seen much over the last 20 to 30 years, a slow interpreter is not necessarily a stumbling block to commercial software. A significant percentage of the web exists due to strictly interpreted languages (though more powerful than BASIC) such as PHP and Javascript.

Anyway, I don't mean to poo poo the ideas. Just sharing my thoughts.

voidstar
Posts: 451
Joined: Thu Apr 15, 2021 8:05 am

concept for a 21st century BASIC?

Post by voidstar »


Oh yes, IMO QuickBASIC was something else altogether.    It was a game-changer and I could certainly see commercial programs coming from that.

For more traditional BASIC: the TRS-80 (the CoCo 2 at least) had the RENUM command to automatically renumber your BASIC programs (and re-adjust GOTOs). This wasn't on the original Commodore PET BASIC. See also (later "external" solutions to the lack of renumbering on Commodore): https://www.masswerk.at/nowgobang/2020/commodore-basic-renumber
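The interesting part of RENUM is that it can't just relabel lines -- it has to patch every GOTO/GOSUB/THEN target too. A toy two-pass sketch of that idea in C++ (hypothetical helper; the parsing is deliberately naive, and a real RENUM also handles ON...GOTO lists, etc.):

```cpp
#include <cstdio>
#include <map>
#include <regex>
#include <string>
#include <vector>

std::vector<std::string> renum(const std::vector<std::string>& prog) {
    // Pass 1: map each old line number to a new one (10, 20, 30, ...).
    std::map<int, int> newnum;
    int next = 10;
    for (const auto& line : prog) {
        newnum[std::stoi(line)] = next;  // leading line number
        next += 10;
    }
    // Pass 2: rewrite each line's label and any branch targets.
    std::regex label(R"(^\d+)");
    std::regex target(R"((GOTO|GOSUB|THEN)\s+(\d+))");
    std::vector<std::string> out;
    for (const auto& line : prog) {
        std::string s = std::regex_replace(
            line, label, std::to_string(newnum[std::stoi(line)]));
        std::string fixed;
        std::smatch m;
        while (std::regex_search(s, m, target)) {  // patch one target at a time
            fixed += m.prefix().str();
            fixed += m[1].str() + " " + std::to_string(newnum[std::stoi(m[2].str())]);
            s = m.suffix().str();
        }
        out.push_back(fixed + s);
    }
    return out;
}

int main() {
    std::vector<std::string> prog = {"5 PRINT \"HI\"", "7 GOTO 5", "12 GOSUB 7"};
    for (const auto& l : renum(prog)) std::puts(l.c_str());
    // prints: 10 PRINT "HI" / 20 GOTO 10 / 30 GOSUB 20
}
```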


Also in my notes, I was reminded of Bob Pew's notes, now maintained by David Wheeler (a survey of languages and their suitability for the 6502):

https://dwheeler.com/6502/

WAY down in the above link is a very fascinating project: the MOnSter 6502.

https://monster6502.com/

What's most fascinating to me about it is how much slower it runs than the original 6502 (1/20th or 1/50th the speed, I think) -- which is proof that "interconnects matter": the physical distance data has to travel. I harp on this a lot, where folks take it for granted writing log files across a network share, with not a care in the world about what that costs ("it works for me" -- yeah, but in the target integrated environment, everyone is competing for that pipe). The servers may have 10 Gbps interconnects, but the end-user front ends are 1 Gbps straws -- so shoving an 860 MB XML message down there was just not workable, mixed in with all the other traffic. Or, in my extreme, keep the processing in L3-cache-sized chunks, since even touching main memory is too slow -- the metal distance actually matters. (A sketch of that chunking is below.)
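A minimal sketch of that L3-sized chunking, assuming (purely for illustration) around 32 MB of usable L3 and a job that makes two passes over the same data; pass1/pass2 are hypothetical stand-ins:

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Illustrative assumption: ~32 MB of L3 we can realistically keep warm.
constexpr std::size_t L3_BYTES = 32ull * 1024 * 1024;

void pass1(float* p, std::size_t n);  // stand-in: e.g. scale / filter
void pass2(float* p, std::size_t n);  // stand-in: e.g. threshold / reduce

// Naive: two full sweeps. By the time pass2 starts, the front of the
// array has been evicted -- every byte is fetched from DRAM twice.
void process_naive(std::vector<float>& data) {
    pass1(data.data(), data.size());
    pass2(data.data(), data.size());
}

// Blocked: run both passes over one cache-sized chunk before moving on,
// so pass2 finds its input still resident in L3.
void process_blocked(std::vector<float>& data) {
    const std::size_t chunk = L3_BYTES / sizeof(float) / 2;  // headroom
    for (std::size_t i = 0; i < data.size(); i += chunk) {
        const std::size_t n = std::min(chunk, data.size() - i);
        pass1(data.data() + i, n);
        pass2(data.data() + i, n);
    }
}
```

Same instructions, same results -- only the order of the memory traffic changes.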

Which is where I'd like my paradigm to go -- to visualize, in real time, things like what percentage of a network pipe you consume. It's fine if you need 500 bytes or 5 MB -- just be aware of it, and understand that at a certain threshold (for example) maybe your application is no longer suitable for a wireless configuration -- and "here" is why: you left some debug logging in, even in the release build.

In a way, maybe the concept is more like programming as if you were inside a TRON-type VR environment. It's the system perspective that programmers should have in their mind's eye -- except there are SO many things to keep track of nowadays. Back to the earlier point on the MOnSter 6502: it's admirable to build a "super-sized" physical 6502 that actually executes instructions. But what if they virtualized that entire thing? Model all that logic in a "CAD" or "3D" or "VR" space, and see what kind of programming language evolves within that space. E.g. someone plops in an ADD instruction, a store to some addresses -- hey, I can "see" my program growing now, "building" software instead of "writing" it, and the system impacts can be more easily summarized (as larger structured INTENTS would correspond to sequences of instructions).

Pretty rad. But it also made me realize where 3D printing is going: why physically build anything anymore? Just model stuff in VR, and only make it physical when absolutely necessary. I've read a lot about the 1851 Great Exhibition in London (so sad the Crystal Palace burned down in the 1930s). Around that time there was a big debate about traditional "hand-crafted" products versus goods produced with mechanized assistance (and the piano companies too "made-by-hand" proud to adapt to the new processes died out...). I consider that 1851 event a focal point in that paradigm change, a clear entry into modern consumerism. A made thing, like a fancy statue, was unique -- until that decade, when they began to come up with mechanized processes to clone things (though with inferior materials). My grandparents liked to collect Remington statues. But in the modern age, why have physical things? Just virtualize it in a model, and if you really want it, 3D print it. That's where we're headed -- RIP antiques (maybe).

So I'm just wondering if, in some way, something similar can apply to software itself. Why commit to outputting 15,000,000 lines of code (an airline ground control station, perhaps) that will mostly rot, not be maintained, and become more and more of a tangled mess as more of the original developers move on? (Are programmers analogous to ancient monks, translating the prior sacred text into whatever the modern target-of-the-day happens to be -- re-inventing the wheel of some function by porting it from QuickBASIC to Java?) Focus on the core intent, and represent that intent in a more approachable and maintainable fashion (some shape in 3-space). And when it's ready -- when you really need it -- "print it" to the language suitable for your target (embedded space/medical device, desktop, mobile). The concept here isn't a new language, but a 3-space visualization of the language (to help trace the relationships between declarations, regardless of what the names are across function calls, and to help visualize the growing resource requirements of the code -- to help avoid being excessively sloppy from the get-go, or to show why running this ROUTE PLANNER and this TRANSCODER at the same time will be a painful experience on THAT target, etc.).

EDIT: Code:Blocks is fun -- it's just that I think it's important to depict that code runs relative to a system. Don't take for granted that you even have a display, or a 5G pipe, or 3+ GHz, or that you're scaling to your number of cores without hogging the system, etc. So much to track... I certainly understand that introductory tools need to be very bare-bones. I just think a relatively simple system like the PET is kind of perfect for swinging in these concepts -- a couple of registers, a small address space, some I/O ports...

v*