fsr help
all because the gemhost disallows requests from http proxies
### programming-languages
before i was interested in computers i was interested in electronics, and before i was interested in electronics i thought (as a very small child) i would be interested in chemistry, because scientists (in cartoons) made what turned out to be chemistry (i just thought it was "science") look REALLY COOL.
the reality of electronics and chemistry (and electrochemistry for that matter) is that doing it in any sort of actual serious field of study requires a particular level of skill with math. that didnt bother me, i just figured i would learn math at some point. i would pick it up along the way, like everyone else did.
and the years of school that tried to force me to learn it didnt work out that well. its not that i wasnt interested or that i didnt try, i just dont speak the language. youll see how incredibly relevant this is to programming when i talk about grace hopper.
but the thing is, no matter how tenaciously or obsessively i pursue a topic in other contexts, if you put equations in front of me and theyre more than 30-second affairs, i just get lost in the syntax. i get lost in the process, and the task simply eludes me.
when im presented with an equation, theres so much information necessary to deal with it that i just dont have readily available. most people manage to learn math anyway. i certainly did, up to long division and basic algebra. i did surprisingly well with geometry, but not well enough that the NEXT level of geometry wasnt too hairy for me. this is really not where i shine at all.
anyway, if i do understand the syntax of an equation, the overall form of it generally lends itself to a particular "trick" or method of solving each part, which im generally poor at identifying in the first place, and then i dont remember it because its something that i never used, and it taxed my interest to the point where i just wanted to go home. in laymans terms, i kind of suck at math, to put it nicely.
sometimes you get a really great teacher who makes some esoteric process incredibly simple by presenting it in a way that it just clicks, and thats awesome. such cleverness (on someone elses part) helped me finally get beginner level algebra, but thats about as far as it ever went.
at least i like to think of myself as proof that you DONT have to be good at math to be good with computers, but its also possible that im proof that you dont have to be good with computers (and certainly not good with math) to write a programming language.
there must be at least one frustrated person who, looking back on the things ive done, would point to the code ive written and say "THIS IS MATH! this IS math!!!" and thats fine, i mean, theyre not necessarily wrong. code is, ultimately, a layer of abstraction over ENTIRELY numeric processes. you dont have to be good at math to understand that.
i didnt get into chemistry or electronics because theres too much math involved, but i got into computers and code because the Math is Different. and because any problem you can break into steps (i realise thats part of doing conventional math too, except that ive never been able to transfer the skill) you can make the computer Do The Steps For You.
and most problems have some library that has already broken a problem into steps AND can do the steps for you. but you still have to know it exists, and how to talk to it.
grace hopper worked on computers before computers were any fun, and when she wanted to do something she had to break it into steps like everyone else- except the steps we have to break things into (because of people like grace hopper) are giant, pyramid stone-like steps, whereas the steps she had to break every task into were like a microscopic atomic crystalline lattice structure.
okay, theyre not quite that far apart. there wasnt enough ram on the mark i to cover that sort of distance. but it was tedious! it was MACHINE CODE, and you really had to hold the computers hand through every step with every individual piece of data.
at least to the point where hopper noticed she was writing ALMOST identical code over and over and over, to do "everyday" (or "routine") steps. and i dont think shes the only person who noticed this at all, it was probably one of the more universal revelations of early computing, but what she did with this issue was relatively unique and we can thank her for modern programming.
im not telling you this because its a fun bit of trivia, but because if you want to understand code on a sort of universal level, this should most likely help. i mean, it wont teach you every language. many computer languages are very tedious and full of concepts i dont get. but it will at least help you understand the idea, so you can appreciate even the concepts you dont understand- as things that serve someone.
not to mention that, in practical terms, some languages are really just an experiment. i wrote one once just to make a very particular point to a single person, which he missed. it could reasonably be called a failed experiment, but i was trying to get through to a person who is impossible to reach and i dont feel too bad about the outcome. it very nearly proved a negative.
we keep finding more and more elaborate ways to do this, and you have "complex instruction set computer" (cisc) and "reduced instruction set computer" (risc) processors, but those "instruction sets" are ultimately sets of those "atomic lattice" level codes that each have a number, and you basically give the cpu (through a series of electronic pulses, against a timer) a number for the instruction and then more numbers for data.
thats it, thats all the computer is doing. a joke ive heard is that computing is nothing more than "counting to 1, really fast" because the switches are base 2 (0 or 1) and after youve counted to 1 (for each place value) it rolls over and the next place value becomes a 1 while the place value youre on rolls back to 0.
which is WAY, WAY easier to appreciate if you make a little odometer using a paper tube and some strips of paper around the tube, with half the strip coloured black and the other half coloured white.
if each strip has 10 places holding the numbers 0 through 9, you have a standard odometer that counts in base 10, and if you add "a, b, c, d, e, f" after 0-9, you can see how hex codes work. if you already use them to represent colours in html (or css) code, you sort of know this. hex (short for hexadecimal) codes are base 16, because each place value has 16 positions: 0 through 9 plus 6 letters. binary has 2 positions: 0 and 1, and octal has 0-7.
if you ever strongly preferred hex codes to rgb() syntax for website colours because Theyre More Efficient, you might understand why we dont just stick to decimal 0-9 or stick to binary. most of the time when you see "binary" anything, it will be in hex codes, because 00 through ff is a very convenient and uniform way to edit long patches of numbers ranging from 0 to 255. not only is 0 to 255 the exact range of numbers you can represent with 00 to ff, its also the exact number of combinations you can represent with 8 bits, or an octet. very few things in computing are arbitrary, least of all the floating point precision.
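if you want to poke at this yourself, python (my own choice of language here, nothing to do with any of the historical hardware) will happily show you the same number in every base mentioned above:

```python
# one number, four ways of writing it down.
n = 255
print(bin(n))   # 0b11111111 : eight binary places, all rolled over to 1
print(oct(n))   # 0o377
print(hex(n))   # 0xff
print(n)        # 255

# an html-style hex colour is just three of those 00-ff octets in a row.
# "#1e90ff" is a colour i picked arbitrarily for the example.
r, g, b = 0x1e, 0x90, 0xff
print(r, g, b)  # 30 144 255

# "counting to 1, really fast": watch the base-2 odometer roll over.
for i in range(5):
    print(format(i, "03b"))  # 000, 001, 010, 011, 100
```

the point being that its the same quantity every time- only the written-down representation changes, which is the convenience mentioned below.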
but as far as i can tell, its really only for convenience that we use these different ways of representing numbers.
but the instructions, theyre things like "take this numeric value- put it in this numeric spot. then take this other numeric value, put it in this other numeric spot. now add them. if the sum is greater than this number, go to this other place and get the numeric instruction from there."
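heres a toy sketch of that in python. the opcode numbers and the little machine itself are entirely made up for this illustration- they dont correspond to any real cpu, but the shape of the loop is the honest part:

```python
# a toy machine: memory is a list of numbers, and the "cpu" reads an
# instruction number, then the data numbers that go with it.
# invented opcodes: 1 = put value in spot, 2 = add two spots into a third,
# 3 = jump elsewhere if a spot is greater than a limit, 4 = halt.
def run(program):
    slots = [0] * 8          # the numeric spots to put values in
    pc = 0                   # which position in the program we are at
    while True:
        op = program[pc]
        if op == 1:          # take this numeric value, put it in this spot
            slots[program[pc + 1]] = program[pc + 2]
            pc += 3
        elif op == 2:        # add two spots, store the sum in a third
            slots[program[pc + 3]] = slots[program[pc + 1]] + slots[program[pc + 2]]
            pc += 4
        elif op == 3:        # if the spot holds more than the limit, go elsewhere
            if slots[program[pc + 1]] > program[pc + 2]:
                pc = program[pc + 3]
            else:
                pc += 4
        elif op == 4:        # halt
            return slots

program = [
    1, 0, 2,        # put 2 in spot 0
    1, 1, 3,        # put 3 in spot 1
    2, 0, 1, 2,     # add spot 0 and spot 1, result goes in spot 2
    4,              # halt
]
print(run(program)[2])  # 5
```

everything there is a number, including the instructions themselves- which is exactly the situation the rest of this section is about.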
so then we do things like reduce the letter "a" to 97, which is what ascii coding did (and unicode does for backwards compatibility) and how does it "know" if 97 is a letter or maybe a particular intensity of red in html hex code? CONTEXT. specifically, different parts of the program handle data in different contexts. the computer "knows" 97 is a letter or colour code because its already running the part of the code that either works with (expects) a letter or works with (expects) a colour code.
like if you are standing in an ice cream shop, you know you arent about to ask the person behind the counter for car parts, because youre standing in an ice cream shop. whats more, if you were to ask for car parts in an ice cream shop, the person who works there would probably either express flat-out confusion (some functional equivalent of "bad input" or "illegal value") or take it in stride and tell you "sorry, we only have ice cream here". this is what error codes are generally like, and also the direction theyve tended to evolve over the years.
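you can see the 97-depends-on-context thing directly in python. the colour "#61a5ff" below is just something i made up for the example:

```python
# the very same byte value, walked into two different "shops":
value = 0x61                      # 97

# context one: a piece of text. the text-handling code expects a letter.
print(chr(value))                 # a

# context two: the red channel of a made-up html colour like "#61a5ff".
# here 97 just means "this much red"- roughly 38 percent of full intensity.
print(round(value / 255, 2))      # 0.38
```

the number never changes. only the part of the program handling it does.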
but also, the part of the code that you wrote to accept a letter isnt going to do anything else, because it does exactly what its written to do. so you might wonder how people get the computer to do something other than what was intended?
i mean, the way thats done isnt actually to get the computer to do something other than what it was told. its literally to insert or change or replace some of the things its told to do with other instructions.
and that sort of code insertion relies on exploiting (getting under a layer of abstraction) the way the computer handles certain processes- the processes in question being referred to as "code execution" or "the execution of code" or "executive processes", and really NOT UN-like the "executive function" many of us lack on those days when we Just Cant. one way is finding a "gap" of codespace that wasnt supposed to be tripped (a buffer overflow) and then putting code there as excessive input data. another is tripping code from some other part of an existing program (return oriented programming, or "rop"). preventing those exploits involves moving things around so theyre harder to find automatically (address space layout randomisation, or "aslr") and handling calls and returns with better accounting.
absolutely NONE of which you need to understand to understand coding, unless youre working on security- which, most coders dont. most leave that to the security people. but if you were interested in that sort of thing, you would eventually learn more about that sort of thing. its beyond me. i know about buffer overflows vaguely, and ive heard of rop gadgets, but the previous paragraph is one of the very few things i actually had to look up- just so i could tell you how rop gadgets are dealt with. the "better accounting" method im referring to is called "retguard", which horribly might even be an acronym, and i didnt decide to stick around to read more about it. but you might.
the thing is, no one really wants to write machine code. they prefer abstractions. i mean, if you want a sandwich- machine code isnt like explaining what to do with bread and other ingredients. its like you just want to say the NAME of the sandwich, but instead you have to sit there and explain exactly where to place each individual piece of lettuce. the exact quantity and placement of mustard. and simply naming the sandwich isnt an option.
maybe you like the degree of control over how a sandwich is made, but if you had to do every single thing in your life that way- describe every step in the smallest level of detail, youd understand why machine code is awful to work with. or you could just work with machine code, and see for yourself.
the next step up from machine code is assembly. assembly works a lot like machine code, except instead of having to look up the numeric code for the type of lettuce you want, theres an obscure three letter abbreviation for it. if youre lucky, its something obvious like LTC. if youre unlucky, its something that only makes sense after its explained, like RLL.
you still have to use numbers to say exactly where you want the lettuce. probably the bread is allocated space according to its height and width, and each number represents an area of the bread starting at the upper left and moving right and down to the lower right corner of the bread. eventually you realise lunchtime is over, and youll have to finish detailing the sandwich tomorrow.
so the first thing grace hopper did (and this much was probably more universal) was start writing "recipes" of machine code, where you had all the numeric instructions laid out already, and all you had to do was tweak a few values from a (handwritten) table to get the sandwich you wanted. this way, instead of thinking about every tedious numeric instruction for a sandwich each time, you just looked up the "recipe" and plugged a few numbers in, and got to work entering all the codes into the computer.
each recipe became a "routine".
computer scientists surely noticed this pattern, and hopper wasnt the only person to notice and take advantage of "routines", but there was no established law for what to DO with them.
on the machine code level, everything is just a number. on the assembly level, things are largely similar but now each basic instruction has a little letter code, which is better than looking up a numeric code for "copy this number to another location". for example in assembly the most universal code for "jump to another place in the code" is JMP, which in basic is called GOTO (or at one point, "GO TO") but hopper is the person who decided that we should be using WORDS, not squiggly equations of esoteric symbols, to write code all day.
and by squiggly equations of esoteric symbols, i mean squiggly equations of esoteric symbols:
=> https://en.wikipedia.org/wiki/APL_syntax_and_symbols#Monadic_functions
which, if you WANT to do that, you should absolutely be able to. people who do things like that for fun are called esolang authors, for writing deliberately esoteric programming languages. grace hopper didnt want that- she wanted programming to be more accessible. she argued for using WORDS, not symbols, and translating THOSE back into machine code.
i mean, i will argue that she had basically the right idea, that it was the ultimate boon to (and the beginning of) modern programming, and most certainly the reason basic was developed at dartmouth, paving the way for magazines in the 80s full of code you could type into your cheap microcomputer and get some kind of game from the experience. which inspired the next generation of coders and made programming more pleasant.
but i think the truth is that the ideal programming language for MOST people is actually a "balance" of maybe 70 percent hopper style code and 30 percent "esoteric symbols"- which, ideally, come from the character set of a classic typewriter and not a keyboard designed specifically for apl.
this is just my opinion, of course.
you may be wondering how, in the austin-powers-like era of computing, people were supposed to make computers understand WORDS instead of symbols.
machine learning actually goes back pretty far, but to do much thats useful (even to recognise handwriting, like some interesting hardware in the 90s was capable of doing) requires more processing power than you could probably stick in most early minicomputers or a commodore 64. and yet the entire basic programming environment was included on the c64- on a single chip, no less- which interfaced with a simple 8-bit mos 6510 cpu and less than 66 THOUSAND bytes of ram.
to put that in perspective, the plaintext (most likely in utf-8 encoding, but with these characters thats basically ascii) of this chapter up to and including the paragraph before this one would fit in the commodore 64s ram only FOUR times. five copies would be more than 65,536 bytes.
basic doesnt "translate" basic syntax and words like "PRINT" or "CLS" or "LET" to machine code using an llm, it basically assigns a number to each command, and the number corresponds to a routine like in grace hoppers dictionary of "recipes", and the "parameters" or "data" that go with some commands are encoded as necessary- using very mechanical, VERY simple, non-llm based functions.
like if you had a command such as GETICECREAM "chocolate" the translated program would have a number for the GETICECREAM command, it would go to the GETICECREAM part of the code (you didnt write the GETICECREAM feature, but it will be included in the translated code from whatever code dictionary the programming language included it from) and either the GETICECREAM part of the code knows how to handle the digital pattern (series of bytes) that "chocolate" represents, OR, it "passes" (copies) the data we wrote as "chocolate" to ANOTHER part of the code that does something with it.
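a toy sketch of that dispatch, in python. GETICECREAM, its token number 10, and the flavour handling are all invented for this example- no real basic works exactly like this, but the shape is the same:

```python
# the routine itself: this part of the code "knows" the bytes it
# receives are a flavour name, because thats the context its written for.
def get_ice_cream(flavour):
    return "one scoop of " + flavour

# each command word gets a number (a token), and the number points at
# a "recipe" in a dictionary of routines- like grace hoppers recipes.
KEYWORDS = {"GETICECREAM": 10}
ROUTINES = {10: get_ice_cream}

def translate(line):
    # a very mechanical, very simple, non-llm based function:
    # keyword becomes a number, the quoted data is passed along as-is.
    word, _, arg = line.partition(" ")
    return KEYWORDS[word], arg.strip('"')

def run(line):
    token, data = translate(line)
    return ROUTINES[token](data)    # look up the recipe, hand it the data

print(run('GETICECREAM "chocolate"'))   # one scoop of chocolate
```

the dictionary lookup plus the handoff of data is the whole trick- no understanding of WORDS required, just a table.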
its such an incredibly mechanical process that i like to compare it to this beautiful machine:
=> https://inv.nadeko.net/watch?v=IvUU8joBb1Q
Wintergatan - Marble Machine (music instrument using 2000 marbles)
so you might think that i expect you or anyone else to memorise all this and then some time when youre middle aged, go out into the world and apply it. but i feel bad for anyone who does it that way.
the way i would recommend to any "ordinary human being" including but not only us, is pick a simpler language (or invent one, ill tell you how) and build with that until you actually want to learn a more sophisticated one.
in unix-inspired systems theres a command called awk- the a is for aho, the w is for weinberger, and the k is for kernighan. not only was kernighan instrumental in documenting and helping people learn "c" (the famous "k&r" c book is kernighan and ritchie, ritchie being the author of c itself) but he also invented the hello world program.
the inventor of the hello world program is well aware that a hello world program is, as a program, basically useless. you can make a computer say "hello world" by simply typing it like i just did:
### hello world
and writing a program to do that FOR you is, very possibly, a ridiculous waste of time.
but the point of a hello world program ISNT to make the computer SAY "hello world", its to demonstrate a ridiculously and deliberately simple task in whatever language is used to achieve said task.
the actual code of hello world is the point, and not what it DOES.
most programs are the other way around- what they DO is the point.
it was actually dtss, rather than basic itself, that introduced line numbers to the language.
the whole thing was developed for (paper) teletypes, because at the time the same amount of money would buy either a bunch of teletypes or a single "glass teletype" or monitor-and-keyboard-in-one that talked to the big, big, big ass computer over cables or phone lines.
if youre working on paper, every time you type something you use up the line you wrote it on. then the computer replies, using up another line. the paper keeps moving up, you keep finding yourself on a new line, so if you wanted to say something like:
```
hi.
this is a program.
okay not really, this is just ordinary english.
you get the idea though.
```
and you want to say "go back to the second line and change 'program' to 'line of text'" the computer would say:
I DONT UNDERSTAND ANY OF THAT. THIS ISNT CHATGPT, ITS JUST A SIMPLE COMMAND INTERPRETER.
actually it would say something more like:
BAD COMMAND, TRY AGAIN.
but it wouldnt tell you to try again, it would just tell you in very few words that it didnt understand your input, at which point you could learn how to talk to it or just give up.
so if you add line numbers, the computer has something specific you can reference later:
```
1 hi.
2 this is a program.
3 okay not really, this is just ordinary english.
4 you get the idea though.
```
now you can say EDIT LINE 2 and the command interpreter knows that one, so it spits line 2 back at you:
2 this is a program.
and under that you type:
```
2 this is really important text.
```
so now what the computer is storing instead is:
```
1 hi.
2 this is really important text.
3 okay not really, this is just ordinary english.
4 you get the idea though.
```
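the storage side of that is simple enough to sketch in python. a line-numbered program is really just a table mapping numbers to text- this dict-based "editor" is purely illustrative:

```python
# what the computer is storing: a number -> text table.
program = {
    1: "hi.",
    2: "this is a program.",
    3: "okay not really, this is just ordinary english.",
    4: "you get the idea though.",
}

def edit_line(number, text):
    # replacing an existing line and inserting a brand new one
    # are the exact same table operation- which is also why numbering
    # by tens (10, 20, 30...) leaves handy gaps to slot lines into.
    program[number] = text

edit_line(2, "this is really important text.")

# "listing" the program just walks the keys in order.
for number in sorted(program):
    print(number, program[number])
```

the command interpreter never has to understand your prose- it only has to understand "a number, then whatever text follows it".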
as glass teletypes or "video terminals" became more affordable and common, there were special codes which would tell the cursor to do things like move to the beginning of the line, clear the screen, change the colour of text, or move to position row, column (or in more graphical terms: y, x). these codes are still emulated by most "term windows"- hence the phrase "terminal emulator", because it emulates these old machines and their control codes. input routines also started letting you do things like hit ctrl-w to erase the entire word you just typed, if for example you wanted to type a different one instead.
if you ever wonder, a term window is (as just mentioned) called a "terminal emulator" and a vt is a "virtual terminal" but they do the same thing, a vt is just full screen on a framebuffer (or if there is no framebuffer, then the native text output of the display card).
you can also connect a real tty device to the serial port (if your computer has one. i dont know how this works with usb-to-rs232 adapters, but the "s" in "usb" also stands for "serial" because thats how the data is pulsed over it) and use your computer the old-fashioned way. most people prefer a separate monitor and keyboard, and they tend to weigh less and be easier to find. not to mention that you dont have to keep feeding paper to them.
as to why line numbers tend to appear in multiples of 10, thats because its a cheap hack to make it easier to insert lines between the "first" and "second" line. you would call the first line 10, the second line 20, and if you inserted a line 15 then you still had lines 11-14 and 16-19 available for future edits. you could also renumber every line, which some basic implementations managed for you but most made it necessary to use bigger line numbers.
decimal (really, floating point) line numbers wouldve also fixed this, at the cost of being not only more painful to read and reference, but also more expensive to process at a time when processing was that expensive. ultimately, the issue of mandatory line numbering was solved by naming line labels (or just naming functions and subroutines) when necessary, and by developing editors based on the capabilities of a glass tty, rather than a continuous-feed paper teletype.
line numbers still appear in editors and error messages, but instead of being statically assigned by the author theyre dynamically assigned to whatever text happens to be on that line in the current version of the code as it is.
its my personal opinion that the best way to learn programming is to write your own programming language, so that will be the next chapter:
=> writing-your-own-programming-language.html writing-your-own-programming-language
=> https://fsrhelp.envs.net/
(back to the main page)
=> https://portal.mozz.us/gemini/fsrhelp.envs.net/
(it wouldve been cooler to do it this way instead)
license: 0-clause bsd
```
# 2018, 2019, 2020, 2021, 2022, 2023, 2024, 2025, 2026
#
# Permission to use, copy, modify, and/or distribute this software for any
# purpose with or without fee is hereby granted.
#
# THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
# WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
# MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
# ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
# WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
# ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF
# OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
```