concept for a 21st century BASIC?

Feel free to talk about any other retro stuff here including Commodore, Sinclair, Atari, Amstrad, Apple... the list goes on!
Scott Robison
Posts: 952
Joined: Fri Mar 19, 2021 9:06 pm

concept for a 21st century BASIC?

Post by Scott Robison »



1 hour ago, voidstar said:




Oh yes, IMO QuickBASIC was something else altogether.    It was a game-changer and I could certainly see commercial programs coming from that.



For more traditional BASIC: the TRS-80 (the CoCo2 at least) had a RENUM command to automatically renumber your BASIC programs (and adjust GOTO targets accordingly).  But this wasn't in the original Commodore PET BASIC.  See also (later "external" solutions to the lack of renumbering on Commodore):  https://www.masswerk.at/nowgobang/2020/commodore-basic-renumber
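
Just to make the mechanics concrete, here's a rough C++ sketch of what a RENUM-style tool has to do: hand out new line numbers in a fixed increment, then patch every jump target.  This is purely illustrative - it only handles single GOTO/GOSUB/THEN targets (no ON...GOTO lists or computed jumps), and the start/step defaults are just the usual convention.

// Minimal sketch of a BASIC RENUM: assign new line numbers in a fixed
// increment, then patch GOTO/GOSUB/THEN targets to the new numbers.
#include <iostream>
#include <map>
#include <regex>
#include <string>

std::map<int, std::string> renumber(const std::map<int, std::string>& prog,
                                    int start = 10, int step = 10) {
    // Old-to-new line number mapping, walked in ascending order of old numbers.
    std::map<int, int> newNum;
    int n = start;
    for (const auto& [oldNum, text] : prog) {
        newNum[oldNum] = n;
        n += step;
    }
    // Patch jump targets inside the statement text.
    std::regex jump(R"((GOTO|GOSUB|THEN)(\s+)(\d+))");
    std::map<int, std::string> out;
    for (const auto& [oldNum, text] : prog) {
        std::string fixed;
        size_t last = 0;
        for (auto it = std::sregex_iterator(text.begin(), text.end(), jump);
             it != std::sregex_iterator(); ++it) {
            const std::smatch& m = *it;
            fixed += text.substr(last, m.position() - last);
            int target = std::stoi(m[3].str());
            int mapped = newNum.count(target) ? newNum[target] : target;  // leave unknown targets alone
            fixed += m[1].str() + m[2].str() + std::to_string(mapped);
            last = m.position() + m.length();
        }
        fixed += text.substr(last);
        out[newNum[oldNum]] = fixed;
    }
    return out;
}

int main() {
    std::map<int, std::string> prog = {
        {5,  "PRINT \"HELLO\""},
        {7,  "IF A=0 THEN 5"},
        {12, "GOTO 7"},
    };
    for (const auto& [num, text] : renumber(prog)) {
        std::cout << num << " " << text << "\n";   // prints lines 10, 20, 30 with fixed targets
    }
}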



Also in my notes, I was reminded of Bob Pew's survey of languages and their suitability for the 6502, now maintained by David Wheeler:

https://dwheeler.com/6502/



WAY down in the above link is a very fascinating project: the MOnSter 6502.

https://monster6502.com/

What's most fascinating to me about it is how much slower it runs than the original 6502 (1/20th or 1/50th the speed, I think) -- which is proof that "interconnects matter", i.e. the physical distance that data has to travel.  I harp on this a lot, where folks take it for granted when writing log files across a network share, without a care in the world about what that costs ("it works for me" - yeah, but in the target integrated environment, everyone is competing for that pipe).  The servers may have 10GBps interconnects, but the end-user front ends are 1GBps straws - so shoving an 860MB XML message down there was just not workable, mixed with all the other traffic.  Or, in my extreme case, keep the processing in L3-cache-sized chunks, as even touching main memory is too slow - the metal distance actually matters.



I've looked at all those resources recently as I've been researching for my own BASIC successor, but I am in no way trying to go as far as your suggestions with it. I want to see a more expressive language that doesn't have as much interpretation overhead as BASIC, but I'm not trying to get to zero overhead. Assembly has its place, as does C and other compiled languages. I just want to see something that can make for a friendlier / more structured experience, where portions are "compiled" or tokenized in advance. Take the simple example of numbers in interpreted BASIC, which are stored as an array of PETSCII digits that have to be converted each and every time the line is run; there should be a tokenized format that preprocesses those digit sequences into binary numbers with no runtime overhead. Likewise, long variable names that are more distinct than two-character variables, but stored in a compact format so the runtime doesn't have to search for the long names while running.
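
To make that concrete, here's a rough C++ sketch of the number-crunching part (the token layout is made up for illustration, not any real BASIC's format): digit runs get converted to binary once, at tokenize time, so the runtime never re-parses them.

// Sketch of "crunching" a numeric literal once at tokenize time instead of
// re-parsing ASCII/PETSCII digits every time the line is executed.
// Overflow of values above 65535 is ignored to keep the idea clear.
#include <cctype>
#include <cstdint>
#include <string>
#include <vector>

constexpr uint8_t TOK_INT16 = 0xF0;   // hypothetical "inline integer" marker byte

// Copy source text into the token stream, folding runs of digits into a
// pre-converted little-endian 16-bit value.
std::vector<uint8_t> tokenize(const std::string& src) {
    std::vector<uint8_t> out;
    size_t i = 0;
    while (i < src.size()) {
        if (std::isdigit(static_cast<unsigned char>(src[i]))) {
            uint16_t value = 0;
            while (i < src.size() && std::isdigit(static_cast<unsigned char>(src[i]))) {
                value = value * 10 + (src[i] - '0');   // conversion happens once, here
                ++i;
            }
            out.push_back(TOK_INT16);
            out.push_back(value & 0xFF);   // low byte
            out.push_back(value >> 8);     // high byte
        } else {
            out.push_back(static_cast<uint8_t>(src[i++]));   // everything else passes through
        }
    }
    return out;
}

// At run time the interpreter just reassembles two bytes; no digit parsing.
uint16_t readInt16(const std::vector<uint8_t>& toks, size_t pos) {
    return static_cast<uint16_t>(toks[pos]) | (static_cast<uint16_t>(toks[pos + 1]) << 8);
}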

Anyway, I have given thought before to doing an XML or JSON based "language" that represents all the usual constructs in a normalized manner, but that has a rigid definition so it could be easily translated into "any language". But even that isn't necessarily visual.
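
As a toy illustration of the "normalized representation" idea (the node shapes and emitters here are hypothetical; a real attempt would be a proper JSON/XML schema covering far more constructs), one normalized loop could be rendered into different surface syntaxes:

// One normalized construct, two trivial emitters - the point is that the
// representation is rigid and syntax-neutral, and translation is mechanical.
#include <iostream>
#include <string>
#include <vector>

struct PrintStmt { std::string text; };

struct CountLoop {            // "FOR var = from TO to" / "for (...)"
    std::string var;
    int from;
    int to;
    std::vector<PrintStmt> body;
};

std::string emitBasic(const CountLoop& loop) {
    std::string s = "FOR " + loop.var + " = " + std::to_string(loop.from) +
                    " TO " + std::to_string(loop.to) + "\n";
    for (const auto& p : loop.body) s += "  PRINT \"" + p.text + "\"\n";
    return s + "NEXT " + loop.var + "\n";
}

std::string emitC(const CountLoop& loop) {
    std::string s = "for (int " + loop.var + " = " + std::to_string(loop.from) +
                    "; " + loop.var + " <= " + std::to_string(loop.to) + "; ++" + loop.var + ") {\n";
    for (const auto& p : loop.body) s += "  printf(\"" + p.text + "\\n\");\n";
    return s + "}\n";
}

int main() {
    CountLoop loop{"I", 1, 3, {{"HELLO"}}};
    std::cout << emitBasic(loop) << "\n" << emitC(loop);
}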

The saying is that a picture is worth a thousand words, and I do consider good programs works of art, but there is an expressiveness that has to be understood by both the human and the computer. I may just not have sufficient imagination, but text-based languages are the medium that provides just the right mix of concrete and abstract, allowing us to work effectively with our tools.

I can see certain types of visual tools working well for narrow types of tasks on sufficiently fast processors with enough memory and bandwidth. I don't see how it could work well in a retro environment, though, unless we want to say "you have to give up actually developing on the machine".

I do agree that there is not enough consideration given by many developers today to how their code impacts the hardware. They've been taught to rely on garbage-collected languages with big, heavy libraries because programmers are bad at managing things like memory (though they never seem to wonder what language was used to implement their preferred language).

Anyway, best of luck. I look forward to more fleshing out of ideas with some sort of prototype so that I can better understand what you're suggesting.

voidstar
Posts: 451
Joined: Thu Apr 15, 2021 8:05 am

concept for a 21st century BASIC?

Post by voidstar »


I think I've rationalized out why this idea won't work.

The Phoenicians did a good job with the alphabet; writing hasn't changed in thousands of years.  The media (stone, papyrus, paper) have.  But fundamentally, to note an idea down for an enduring purpose, writing still works.   Then, around 1900, we started to have movies.   And while "a picture is worth a thousand words" indeed, it's still the case that "the book is always better than the movie" - since in writing, you can express the UNSEEN details of what characters are thinking and feeling, the surrounding details of a scene (the reason that spider lairs in that cave), etc.

So, even in movies, it begins with writing: a script, storyboarding, etc.   In programming, we use "notepad".  Our thoughts, our functions, don't need to be complete.  We can stub things out, move things about.  If I feel two spaces here helps my thinking, so be it.   Plus, across platforms and systems, the "notepad" is one of the easiest applications to port -- and needs little to no training.  As intuitive as paper and pencil: grab a sheet and start drawing.

BUT...

When software, a program, is realized into machine code, it becomes a Clockwork/Steampunk kind of device, with levers and spinning things.  It reminds me of the scene in Back to the Future 3, where Doc created that massive scary machine, with belts and smoke, to make ONE sad-looking ice cube.  Or like the original Charles Babbage machine, the original "program" constructed.

And that's my point: when we compile software, it translates into a precise machine that consumes time and space, and competes for resources that other programs want to share.  And I've seen some horribly inefficient code - apparently, not everyone took an Algorithms course or is aware of binary trees.  When the display team says they're "600% over budget" to have those features - yeah, with code like that, no surprise.  But this is nothing new; this dilemma is why there are multiple high-level languages, as they try to close the gap between expressing intent and having it realized efficiently.

So that was the gist of the idea: write code, as-is.  But at the same time, have the code virtually rendered in 3-space, an island clockwork/steampunk type machine that depicts the ins and outs, dependencies, and its overall "bulk" and efficiency (it doesn't have to be as precise as machine code, just normalized to show relative complexity against all other code).  The UNSEEN details, however, remain in the code.   And from this, maybe it gives management better insight into what their software projects are doing - they can see the birth during the Zero-to-One transition (I don't mean BITS, I mean from NOTHING to VERSION 1), and the subsequent growth thereafter:  e.g. we're integrating the compression feature today (watch this chunk of an island migrate over to the main code, see the connection points -- and what "cost" that compression entails, as the combined structure is now physically {but virtually} too large to fit in a Type-1 processor, etc.).

So - coding as we know it, that free-form notepad, has to stay  [ even Microsoft has to darn near give Visual Studio away for free; any neat tool you try to build can't compete with the zero cost of these existing tools - that's the other reason UML failed: exotic $10k/seat tools that needed on-site consultants to use, "no thanks, get out" ].  But the real-time Situational Awareness of the dependencies and resources being consumed, I think, needs some attention (and training younger folks to keep those aspects in mind would be a Good Thing - we're coding to a specific system that has constraints; even today, don't take it for granted that you have a Virtual Infinity computer - try to allocate 512GB of RAM to store full world DTED, so we can do full-spectrum line-of-sight computations, and see what happens even on your glorious 64-bit machine).

We'll get there, when we need to.  After all, we flew a helicopter on Mars recently and saw HD video from it - amazing software is getting done every day.  Yes, there is a painful shortage of software talent; it's just that there are so many exciting things we're collectively chomping at the bit to get done.  We'll get there...

 

NOTE: I'd say writing pure assembly is PROGRAMMING, but it is not SOFTWARE.  Yes, it's symbolic.  But you're effectively running patch cables to specific addresses and twiddling bits/knobs, which is admirable to watch done by a professional of that system.  The defining aspect of SOFTWARE is portability across platforms, with very little adjustment.  Though clearly, there is a subjective threshold to what "very little" means.  So there is a distinction between a Computer Programmer (a mechanic of sorts) and a Software Engineer.

TomXP411
Posts: 1762
Joined: Tue May 19, 2020 8:49 pm

concept for a 21st century BASIC?

Post by TomXP411 »



1 hour ago, voidstar said:




I'd say writing pure assembly is PROGRAMMING, but it is not SOFTWARE.  Yes, it's symbolic.  But you're effectively running patch cables to specific addresses and twiddling bits/knobs, which is admirable to watch done by a professional of that system.  The defining aspect of SOFTWARE is portability across platforms, with very little adjustment.  Though clearly, there is a threshold to what "very little" means.



This is incorrect.

"software" is a collection of programs that run on a computer. By definition, computer programs are software, and software is computer programs. 

There's nothing in the definition of "software" that requires it to be portable: the algorithms expressed in the hand-wired patch cables used to program early computers, such as ENIAC, are just as much software as a modern C++ program. 

As to "Intent Oriented Programming": I'd argue that what you're really arguing for is more formal Software Engineering. The software industry has been permeated by people who would rather code than graph out a problem, and this has the result of creating software with gaping holes in its design, huge bugs in its implementation, and inconsistent design patterns throughout. 

We don't really need to invent new terms and methods for the industry. Instead, we need to apply disciplines that have already been created. Software engineering is a mature science - but most "software engineers" are not engineers at all, but rather code monkeys. 

There's a reason my college has a Master's program for Software Engineering, which is a completely different discipline than coding. 

 

 

 

 

rje
Posts: 1263
Joined: Mon Apr 27, 2020 10:00 pm
Location: Dallas Area

concept for a 21st century BASIC?

Post by rje »


Wikipedia has a standard definition of "software", basically in line with Tom's.

 

As far as software goes: it will ALWAYS be harder to read than to write.  

I think this is a fundamental difference between programming and writing.

And I don't think that will change, unless you can simplify requirements gathering and decision-making.

 

voidstar
Posts: 451
Joined: Thu Apr 15, 2021 8:05 am

concept for a 21st century BASIC?

Post by voidstar »




printf("Hello World");

The above SOFTWARE manifests as one combination of sequences that apply 5V for THIS system and another combination for THAT system - one PROGRAMS those systems via those combinations accordingly.

The first use of RAM was what, 1948 (Manchester Baby), maybe?   After that point, it was resident-memory SOFTWARE that enabled the RE-PROGRAMMING of all that wiring (otherwise it was just a fancy switchboard).

These are collectively just casual thoughts, an opinion.  Such is the nature of non-networked brains, we each have our own unique perspectives on things, accumulated from respective experiences.



My Masters was in Computer Engineering at the University of Florida.  Maybe the extra Digital Logic and Microprocessor courses gave me a tad more-than-usual appreciation of the hardware.  But you are quite right: the science and discipline for good Software Engineering is there.   I often emphasize the importance of Design, and have to take certain managers aside: don't jump to the coding phase so fast.   Preliminary and Detailed Designs seem like a lost art.    Yes, "code is king" in the business world - but there is absolutely wisdom in spending the bulk of the budget on Design Artifacts for work that is intended to endure.

NOTE: We've debated about that... "If our headquarters burned down, would you rather save the code or the designs?"  Out of 10 people, I'm the only one who said Design.  And I got a beating: "The code is what runs!"  Yeah, but... Couldn't win.  (It's a thought exercise; obviously everything is triple-backed-up and geographically separated and all that - I think one backup is even in orbit.)

 

Here's a weird thought:  it once occurred to me that one could write every possible program for a system.  01, 10, 11, 100, 101, 110, 111, etc.  Walk the combinations from 1 byte to megabytes, and literally every possible program for the code-space of that system could be auto-generated (except, of course, you'd run out of space to hold all those combinations anywhere).   It's not a very useful thought, but I still find it amusing.   Could the most perfect PROGRAM be hiding somewhere in there?  (In contrast, doing so would never reveal the perfect SOFTWARE.)
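
For amusement, the combinatorics alone make the point - a tiny sketch (just counting; actually enumerating or storing the images is hopeless, as noted):

// Every possible byte string of length 1..N is a candidate program image.
// There are 256^N of them at each length, which blows up almost immediately.
#include <cmath>
#include <cstdio>

int main() {
    const int maxLen = 8;                       // even 8 bytes is ~1.8e19 images
    long double total = 0.0L;
    for (int len = 1; len <= maxLen; ++len) {
        long double count = std::pow(256.0L, len);   // distinct byte strings of this length
        total += count;
        std::printf("%2d byte(s): %.3Le images\n", len, count);
    }
    std::printf("total up to %d bytes: %.3Le\n", maxLen, total);
}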

 

 

EDIT:  I'll rescind my thought on pure assembler, but for the following reason: assembly absolutely deserves all the respect and legal protection of any other type of software.  No reason to confuse lawyers about that (not that there was, but the principle remains).  But I will still simply say: assembler is software of a different sort.    Some software is very system-purpose-focused (e.g. hardware drivers), while other software is more abstracted from system specifics (generally involving some kind of compromise, maybe in performance, but with a general benefit of broader portability).

EDIT2:  But on second thought... there are multiple ways to execute instructions.  I could sit there with wires myself, poking 5V onto the bus lines going into a processor (I couldn't do it fast enough, but the principle remains).  I could use the presence of bubbles (see Bubble Memory).  I could use smoke signals (giving another meaning to the word VaporWare!).   Or, use RAM.   All of those entail a specific medium and a specific combination to PROGRAM that system.  But at what point does it transition into SOFTWARE?  I can copyright my ASM code.  But can I also copyright my hand motions (of applying 5V here and there)?   If I could twist my fingers fast enough, like gang signs, to represent hex codes that a processor understands - is that SOFTWARE?



 

voidstar
Posts: 451
Joined: Thu Apr 15, 2021 8:05 am

concept for a 21st century BASIC?

Post by voidstar »


A slightly older peer suggested that what I am proposing was called Mainstay VIP for the Macintosh.  In a way, sort of.   I'm OK with scrubbing the notion of "intention based programming" - done with that (it was a means to perhaps more easily get toward what I was really proposing).  I'm proposing more of an enhancement to existing IDEs, as "add-ons" are more approachable.  We're long past 80x25 screens to code in.  Use my 7 other monitors to give me Situational Awareness about my program.  If I could just "look behind" the code, at an angle -- and see all that coupling, dependencies, etc.  We reach that point in our minds; offer a way for others to reach it more quickly, to see the ramifications of both design and implementation decisions.

 

NOTE: The 1943 novel The Fountainhead was such a good story about the pursuit of perfection in one's craft.

 

EDIT:  It's interesting to me that we have this term "code monkey", as it is similar to the notion of "wrench monkey" in other areas (I think that was depicted in the TV series "The 100", where a "wrench monkey" had to construct an escape vehicle in secret, as the situation demanded someone who could just get it done, quickly).    Or it's analogous to how "mechanics" were treated by WWI pilots: they might not know the science of the gas, ignition, and pressure, but they could tear down the engine and rebuild it to fly yet one more day.

Scott Robison
Posts: 952
Joined: Fri Mar 19, 2021 9:06 pm

concept for a 21st century BASIC?

Post by Scott Robison »


Speaking of engineering, I feel like scrum has been a plague in many ways. I'm not opposed to agile, and I agree with the manifesto. It is just what some companies have done to agile under the name "scrum" that really bothers me. Agile is supposed to do away with certain things that scrum seems to double down on. There is far too much "no need to think about the problem because we'll just throw it away later; we only need to do the minimum work to achieve 2-week sprint goals". The idea that code will be thrown away becomes a self-fulfilling prophecy.

This is not to say that every detail of "Formal Scrum(TM)(Patent Pending)" is bad, but IMO it just tries to replace one set of often-bad practices ("Formal Waterfall(TM)(Patent Pending)") with another.

voidstar
Posts: 451
Joined: Thu Apr 15, 2021 8:05 am

concept for a 21st century BASIC?

Post by voidstar »


Sorry to drivel on, truly.  But ONE last thought, and we'll leave this to future trolls.



My "proof" that a better-than-high-level language programming paraigm can't be built - is based on the notion that, for all these centuries, we haven't come up with a better system than WRITING to communicate ideas.   Movies are nice, but they don't (for the most part) convey the UNSEEN details (of feelings, thought, rationale, etc).   

We did come up with Calculus centuries later, as an abbreviated way to express some mathematics, and that helped dramatically.  So originally I pondered whether a similar thing could apply to software - standard symbols for loops, threads, data streaming, etc., to more efficiently express intent (instead of this Babel of programming languages).



BUT.... What about Augmented Reality?



I can wear some Google glasses and walk around a city, and above everyone's head I could see their Academic Status and Financial Status -- "books" and "$$" signs floating above everyone's head (augmented by the glasses), or maybe symbols indicating topics of interest, clubs.   To know the names and criminal history of everyone around me, or just a recent history of what books they've read - maybe call this system Ice Breaker (not that Cyberspace weapon... nevermind) - or call it Deal Breaker? ("danger in that crowd")



Clearly, in contemporary times few would opt in to this, for privacy reasons.  But it is a possibility that AR offers that never existed before - project right onto our optics, chip in eye, no glasses at all - to augment reality with what was previously UNSEEN.   [ And it's interesting to me how WRITING might be changing - are emojis a return to hieroglyphics? Or being able to inject HYPERLINKs as footnotes anywhere, etc.; we can in theory elaborate on any specific point  {which is why "conversation" is so dangerous: I can't pause and clarify things said, nor backspace... poor politicians} ]



That's all:  maybe apply some AR to coding, to somehow show a "weight" for that code (runtime, resource usage, coupling, etc.) in the context of the ACTIVE TARGET platform.  Don't muck with the existing flow of writing, parsing, linking, etc. - but offer more SA (Situational Awareness) about the "cost" of that code, the "unseen" attributes of code, in some more standardized fashion than #pragma sections:  faint shapes lingering "behind the code" to indicate the relative "bulk" or "weight" for the target system.  I can appreciate a Software Purist perfecting class relationships, but there are often missed target-specific nuances.   Still, to what end?  That "shape" is what it is - still no useful insight into how to improve things.  But at least the cost is not masked, maybe helping during integrations to see why what works in isolation now fails.

 

This is pertinent as software and microcontrollers become an even bigger part of our lives -- from being embedded into hypersonics with 2lb payload limits, to being injected into our bloodstream, to literally being woven into the fabric of our clothes.  "Perfect software" that scales to all these targets will remain necessary work.

 

Thanks for the discussion - yes, I tend to agree, existing Engineering Discipline - if followed - should cover all this.  We build (to the platforms we know about), we test, and if it doesn't fit, we Spiral again.   Platforms/targets are going to migrate and evolve; a business can't chase those possible futures with any single "perfect expression" of an algorithm or intent, relative to all current and future targets.  // v*

Scott Robison
Posts: 952
Joined: Fri Mar 19, 2021 9:06 pm

concept for a 21st century BASIC?

Post by Scott Robison »


It is an interesting idea, if we could come to some level of agreement as to what the various shapes / colors / non-textual cues meant. We have a problem with evolution of language already. Look at how people are beginning to object to the terms "master / slave" when used in a technological context. The words have legitimate meaning, yet culturally we evolve language to mean more or less than it did previously. We change the pronunciation of words. An excellent example is how Americans used to pronounce "DATA" most typically as "dah-tuh" before the late 1980s, but we've shifted to "day-tuh" since then. Some credit Patrick Stewart's British accent as driving that over seven seasons of Star Trek: The Next Generation. Other examples are harass (is it "har-ass" or "hair-iss"?) or err ("air" or "urr"?).

Extending that to shapes, colors, iconography, look at the typical "save" icon: a 3.5" diskette. Mainstream computers started abandoning it circa 1998, yet we still have it to this day, and a generation of computer users are likely as unknowledgeable about the significance of the icon as they are about a rotary dial phone.

Written language has, as you've said, the ability to include background information through exposition, parentheticals, asides, and so on. A good text editor can take source code comments and squash them out of the way so that you can view the code without the "extraneous" noise, but then you can click on something to expand it when it is useful.

As for ways to "embellish" programs, I think comments are the "best" (for some sufficiently fuzzy value of "best") way we have to augment the significance of the associated code. If we had smarter tools that could extrapolate common idioms into automatic comments, I could see something potentially useful there, but it seems like a Very Hard Problem(TM) to solve.

C++11 and later have "constexpr". I don't necessarily love the keyword syntax, but the idea is that it is more constant than "const" (which isn't really always constant, but is often simply used as a synonym for immutable). A call to a valid constexpr function with constant arguments can be used as the initializer of a value, as an array dimension, or in other similar "real constant" contexts. Why not have a compiler / environment that, in addition to providing compile-time evaluation of functions to constant values, somehow also did compile- and / or link-time profiling-style analysis? Something that didn't require you to actually run the code but still provided "hot spot" identification of the generated code. That is also a Very Hard Problem(TM), but I think less so than AI-based translation of code idioms into automatic comments, as it were.
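
For example (the function and names here are made up, but the constexpr behavior is standard C++11): the call is evaluated by the compiler, so it can legally size an array, unlike a plain const that merely promises immutability.

// Small example of constexpr as a "real constant".
#include <cstddef>

constexpr std::size_t framesNeeded(std::size_t bytes, std::size_t frameSize) {
    return (bytes + frameSize - 1) / frameSize;   // ceiling division, done at compile time
}

int main() {
    // "Real constant" context: array dimensions must be compile-time constants.
    unsigned char buffer[framesNeeded(1000, 64)];            // 16 elements, computed by the compiler

    const std::size_t alsoSixteen = framesNeeded(1000, 64);  // const here just means immutable
    static_assert(framesNeeded(1000, 64) == 16, "evaluated during compilation");

    (void)buffer;
    (void)alsoSixteen;
    return 0;
}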

Kalvan
Posts: 115
Joined: Mon Feb 01, 2021 10:05 pm

concept for a 21st century BASIC?

Post by Kalvan »


Setting aside the fact that most home computers back then didn't have FPUs (making the sole floating-point variable data type a hair-tearing bug rather than a feature), the biggest problem with BASIC was that it was purely sequential, aside from specific jump commands.  Using the commonly accepted programming conventions inherited from FORTRAN and COBOL (numbering lines in increments of ten), if you needed to patch in more than ten lines between any two existing lines, you had to rewrite the subroutine completely from scratch, and God help you if the result then impinged on the line space of the subsequent subroutine, resulting in cascading rewrites and even more hair tearing.

There's a reason the programming world has moved on to procedural programming paradigms, and further developments from there.

Personally, I would prefer the development of a 21st century version of LOGO.

Post Reply