fsr help
all because the gemhost disallows requests from http proxies
### layers
> What we really need is a cyberpunk renaissance.
its no coincidence that this is called "layers"- you want to be lain, and episodes of lain are called "layers".
if only more people wanted to be lain.
so george carlin is at least a little famous for pointing out that words and phrases tend to get less cool and meaningful as they get drawn out. he only meant this in terms of euphemisms, but it applies to at least some other words and phrases:
### programmer.
depending who you ask, "programmer" is actually a cool word. there are famous programmers who insist theres a difference between "programming" and "coding", and you can let them make their point and learn something potentially useful about history, but its easier to say youre a "coder" and so more cool people will probably say that unless theyre trying to be fancy.
but it varies.
either way, sometimes it might be better to call layers "abstractions".
you can still call them layers. its a fact that i dont know the people involved in writing lain, but referring to each part as a "layer" is most likely a reference to network layers- "application", "protocol", "hardware" and so on- a list of layers that becomes important if you learn more about networking.
but this section (which you could also call a layer, if you want) is using the concept of layers more broadly than network layers, as a way of talking about abstractions.
so what is an abstraction? an example is easier than a definition, for something so incredibly broad.
an abstraction is anything about a computer that has "something else" UNDERNEATH it. in other words: a layer.
but the layers in networking are formal and very technical and specific and well defined, whereas abstractions in computing can be infinite in number and sometimes become vague and very, very often unimportant.
the point of an abstraction is generally to make things easier- simpler, than whatever is underneath.
and generally speaking, abstractions are your friend!
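to make that concrete, heres a tiny sketch in python (my pick for examples on this page, nothing more- the function names are made up): the caller never has to think about file descriptors, buffering or syscalls, because thats three or four layers hidden behind one line.

```python
# a tiny abstraction: save_note() hides files, buffers and syscalls.
# underneath open() is the c library, under that the kernel write()
# syscall, under that the disk... layers all the way down.

def save_note(path, text):
    with open(path, "w") as f:  # open() hides file descriptors
        f.write(text)           # write() hides buffering and syscalls

def load_note(path):
    with open(path) as f:
        return f.read()
```

save_note("x.txt", "hello") then load_note("x.txt") hands you "hello" back, and you never once thought about the disk. thats the abstraction doing its job.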
but you can have too many of them, and then abstractions become a nuisance.
the essence of hacking (in the broadest sense: "messing with computers") is to get through an abstraction to the (metaphorically) meaty goodness underneath. if you are vegan, then this still applies to things like... coconuts.
theres a debate about what the true definition of hacking is, and whether "messing with computers" in the legos and tinkertoys sense ought to be called "hacking" while "messing with computers" in the sense of "youre not supposed to be here" and "this probably isnt legal" gets its own terminology.
i wont make any attempt to settle that debate, to me those are two sides of the same coin but im still using the word in the broader "im a scientist and want to understand this stuff REALLY REALLY well" sense of the word. i wont be telling you how to do crimes. if i even have an area of expertise, you would be better off asking someone much more talented than i am.
we are (as far as i know) going to be "messing around with computers" in the sense of: this is your stuff and you can do what you want with it. you can absolutely use this as a stepping stone to learning about computer security, but i still wouldnt be the person to go to for learning about that in any level of useful detail.
the fun thing though, is the two things are related. you could take a bunch of shortcuts, learn very little about computers and just go get a bunch of tools and learn how to pick digital locks with them. that doesnt interest me personally, but its one approach.
for me personally, learning about computers and how they do things is more interesting, which you can pick up from the starting point of lock-picking but im more interested in making things.
like i dont want to break into my own computer and take over the stuff there, because its more work. i want to remove everything (which admittedly, these days does seem more and more like breaking into your own computer, but thats why i dont knock people wanting to learn that stuff or people who are good at that stuff) and replace it with whatever i like or prefer.
mostly because computers are supposed to be able to do that, and when they cant i wonder why.
but lets talk about layers, in the informal abstractiony sense, because layers ultimately are what computers are all about.
in the most pedantic sense, computers start (at least as much as anything else) at the quantum level and work their way up to the atomic and molecular. this may seem like a joke, but for many years now your hard drive (whether its a usb stick, ssd or good "old" metal-oxide-goes-spinny model) uses quantum tunneling. dont take my word for it, go look this up. i dont believe it either, but it appears to be factual and real somehow.
if i were any good at math, i would have gotten farther into electronics before i even bothered with computers. computers of course, are (usually) made of electronics. sometimes theyre made of cardboard circles or bits of wood, but when we think of computing most often we are talking about electronics. at least i know what an electron is. but what it does in its spare time is way beyond me. you need LOTS of math for that.
so whatever your computer is doing at the quantum level, i dont know anything about it and most people also dont know anything about it. and im not even talking about "quantum computing", in the golden chandelier kind of sense. i mean YOUR personal computer and the physics IT is using to do basic things like read and write data. but because there will be situations where this is RELEVANT (not many) you could say that modern computing actually starts here.
and i didnt mention that just to be cute, i mentioned it because part of real genius is getting from one layer where people expect stuff to happen to a lower layer (less abstraction) and making things HAPPEN there, on that level.
on your own personal computer, i dont think (i dont actually know) that quantum physics started being part of the design process for any other reason than making computers do more and more and more involves making components (or the individual parts of components) smaller and smaller and smaller, until conventional newtonian physics actually stop working the way we want and we have no choice but to use what we know about physics of things that are smaller than most things newton worked with (not including magic, because newton was into kabbalah as well- his sister told him to avoid publishing on that topic however, because she figured everyone would think he was crazy).
so if we wanted hard drives to get a bit smaller and the data they stored to get a lot greater, eventually each bit became small enough that quantum physics do be happening at the point of read and write heads. once again, do not take my word for this. you can look up the bits about newton as well, one of the books he kept on the topic is in a museum in england.
but while on the topic of bits, most computing is done with groups of them. a group of bits is called a "byte" (as you likely already know) and mainframes experimented with several different byte sizes- along with larger groupings of bits, which they petulantly insisted on calling "words".
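if bytes are new to you, python (again, just my choice for sketches) will happily show you the bits inside one:

```python
# every character is stored as a number, and every number as bits.
# "A" is 65, which is 01000001 in binary- eight bits, one byte.
for ch in "hi":
    print(ch, ord(ch), format(ord(ch), "08b"))
```

run that and you get the character, its number, and the actual bit pattern- which is all a byte ever was.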
and then there were different standards for exchanging data- at one point, ibm used "ebcdic" which, thats ibm for you, but morse code seemed a little TOO archaic so it was slightly upgraded to baudot, which those typewriters i mentioned liked using until ascii was invented. baudot was a 5-bit code, ascii was 7 originally, but most computers used AT LEAST 8 bits (an 8-bit byte being not-as-commonly referred to as an "octet") so clever people figured out how to extend ascii coding to use the extra bit, because thats more fun.
but this wasnt non-standard or confusing enough, so at some point a bunch of people who were WAY too clever for ANYONES good decided that adding a bit to ascii was quaint, why not add an entire byte instead? thus so-called "unicode" was invented, and (as youre not reading this now) obviously made it impossible for data to be exchanged ever again. but some people were relieved.
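you can actually watch the extra-bit business and the unicode business happen. in python: plain ascii fits in 7 bits so every encoding agrees on it, while an accented character uses the 8th bit in a single-byte encoding like latin-1, and utf-8 spills into a second byte instead:

```python
# "A" is plain 7-bit ascii, so every encoding agrees on it
print("A".encode("ascii"))    # b'A'

# "é" needs the 8th bit: one byte in latin-1, two bytes in utf-8
print("é".encode("latin-1"))  # b'\xe9'
print("é".encode("utf-8"))    # b'\xc3\xa9'
```

same character, different bytes on the wire- which is exactly why exchanging data got confusing.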
despite these protests, ive never used an 8 or 16 bit computer without wanting at least 32-bit features, and these days if you want a decent browser you pretty much have to run at least a 64-bit system to run it on, which i find pretty disappointing. at some point, no one wants to maintain both a 32-bit AND 64-bit codebase IF the program is large enough, and for browsers- i mean my operating system is a lot more lean than anything legally allowed within hundreds of meters of the mozillaplex. which is actually github, i suppose.
but i digress. after the quantum and atomic and molecular level, which lets face it ALMOST no one writes code at these days, you have the logic circuits and the physical hardware theyre part of. most people just lump all this together into the "hardware layer", which is pretty convenient until youre actually building something.
there was a time, and this isnt a joke, before the bios routines were automated (theyre called uefi or efi now, but people still call them the bios, that much you CAN trust me on but im not saying you should) when people would actually sit and flip switches on the front of the computer to "tell it how" to start loading data or loading a program.
this became automated with paper tape and magnetic tape and finally with bios (rom) chips which today are on flash chips. because theyre on flash they can be updated, and because they can be updated they dont try as hard to make sure the code is "just right" because they can fix it in post. thats a horrible pun (which i didnt intend, not that i expect anyone to believe me) based on the term "power on self test". but you can call it the "logo (oem) screen" if you like.
but to make a ridiculously long story a ridiculous story thats slightly shorter, the hardware is distinctly a layer, theres "firmware" (of which bios is an example) that LOOKS exactly like hardware but is arguably really more like software (firmware is sort of like the tomato of computing) and when you turn the computer on (not unlike what i mentioned in the introduction) and it actually DOES something before you have any "software" or "operating system" installed, THATS the firmware doing it.
so hardware -> firmware -> software, but wait...
you COULD just "write" a program that runs directly on the computer, and people used to, in fact i dated someone who writes code like that. but basically thats how you get into os development. you have the firmware (uefi, which as i said is pronounced "bios") load a bootloader program, the bootloader program tells the computer where to find your "software" (operating system) and just starts running the code.
and the code starts telling the computer how to do all the (this is just my opinion, but im not judging as much as it sounds like) unbelievably tedious and boring details of where the keyboard is (before you even worry about a usb stack- why a "stack?" because its made of layers) and how we are going to talk to the power management, and no matter whether your computer is 32 or 64 or (brought here from the future) 128-bit, the operating system has to spend an unreasonable time pretending its a 16-bit ibm at (or "286") unless youre using risc architecture, then im not sure but i wouldnt put it past them.
i actually like the intel/ibm ecosystem but thats because im a coward. theres not much to like except that it works, because everybody had one until we learned why no one would ever want one. but by then it was too late. it was practically the law. and the law is far more important than whether it ACTUALLY "works" as you might prefer to think it does.
but you could also write for the risc architecture(s), in theory.
your "os" (or demo that puts text on a screen, and maybe lets you type some letters or access a floppy drive- you did remember to install a floppy drive, right? not a usb one, you dont have usb access yet) pokes data at specific places the computer knows because it was designed that way, and then letters appear on the screen until you access a framebuffer so you can start to do things that REALLY hurt.
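the "specific places" bit is real: classic pc text mode keeps the screen at memory address 0xb8000, 80 columns by 25 rows, two bytes per cell (one for the character, one for the colour). you cant poke real video memory from a normal program, so heres a sketch in python that fakes the memory with a bytearray- the address math is the part an actual os would do:

```python
# classic vga text mode: 80x25 cells, 2 bytes each (char, colour attribute)
COLS, ROWS = 80, 25
vram = bytearray(COLS * ROWS * 2)   # stand-in for the memory at 0xb8000

def put_char(row, col, ch, attr=0x07):  # 0x07 = light grey on black
    offset = (row * COLS + col) * 2     # the math an os does to find a cell
    vram[offset] = ord(ch)
    vram[offset + 1] = attr

for i, ch in enumerate("hello"):
    put_char(0, i, ch)
```

swap the bytearray for a pointer to 0xb8000 (in a language that lets you have one) and letters appear on the actual screen. thats the whole trick.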
now, a lot of people are going to think im saying all this to discourage you from os development. that couldnt be farther from the truth. i have, this is the most sincere thing i can say on this entire page, the utmost admiration and respect for people who have the patience and knowledge to work at this level of the computer. there wouldnt be much point in talking about layers or abstractions at all, if i didnt talk about the layer where operating system development actually happens. this is that layer.
its in my opinion, incredibly tedious and detailed and exactly the sort of thing i NEVER want to work on personally, but i have actually worked on it. not- i mean, you go to the osdev website and you find some c code even though you (i mean me) never actually learned c, but youre dating someone who HAS spent way more time developing code like this and is the reason why you know that every 64-bit cisc architecture really does want to pretend its a 286 sometimes, but i couldnt prove it to you, and ive compiled (and hacked on) c code that you actually write to an image, which you then write to a usb, which you then boot just like an operating system only because it basically is one.
so thats exactly why i know its not really my thing. plus, ive read about it.
but people who can do that stuff and KEEP doing it are simply amazing.
the operating system has a bunch of stuff to do, like talk to the hardware. it has to manage resources like ram. it has to provide the bridge between the physical machine and the applications, unless youre going to argue (and fine with me if you prefer to) that the firmware should be mentioned here. i have no horse in that race.
i mean, you can also implement an os in uefi, is the thing. the thing about messing with computers is you can run doom on practically anything. like, you can literally take the amiga chip out of a computer, connect the chip socket to the posts of a raspberry pi, and have the pi pretend its the amiga chip and tell the old amiga to Do Stuff and this has actually been done.
but the basics of old hardware include the cpu chip, the ram, storage devices like floppy drives or hard drives- and these would sometimes connect via an "i/o card" but they kept putting more and more of these "cards" onto the motherboard itself.
like you can still connect a "graphics" or "video card" but now theres usually one included on the motherboard. the same goes for a sound card and networking. the "bus" connects the cards together, but now you have "northbridge" and/or "southbridge" controllers to connect things, and im not into hardware enough to keep track of whats old and whats new.
years go by, things change, im typing this on a machine thats somewhere between 10 and 15 years old that i picked up for 80 dollars because i prefer to salvage things. i DID open it, to remove the lcd screen and the tiny cable that goes to it. i dont LIKE to open these things, but i really, really hate it when they make it more tedious to do so. i despise tablets and phones for the same reason. but ive written code for tablets and phones, which (the way i do it) is exactly like writing code for any other machine, except i wont bother trying to run anything unless its for a browser- pure javascript, no libraries, and of course the screen is just "this big" and if you want a keyboard, its typically closer to HALF of that.
i do actually connect an external keyboard to my phone, but only on rare occasions. needless to say, that for me taking the screen out would only result in a phone that didnt do anything. but you CAN learn how to do it properly if thats your thing.
of course im interested in how this stuff works, within reason. as someone on the forum pointed out, a lot of people dont want or need to write an actual operating system- they just want to write a shell.
i mean, now youre talking. the operating system already does more than you probably want to ever think about doing, while the shell lets you tell the operating system to do literally whatever you want. unless you need a more specific application for that.
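a shell, at its core, is shockingly small: read a line, split it into words, run the program, repeat. heres a sketch in python (the function name is mine- a real shell would fork and exec itself, handle pipes, quoting, job control and a hundred other things):

```python
import shlex, subprocess

# the heart of every shell: read a line, split it, run it
def run_line(line):
    words = shlex.split(line)  # "echo hi there" -> ["echo", "hi", "there"]
    if not words:
        return None
    return subprocess.run(words, capture_output=True, text=True)

# the interactive loop itself would just be:
#   while True: run_line(input("$ "))
result = run_line("echo hello from a two-line shell")
print(result.stdout, end="")
```

everything else a shell does- variables, history, tab completion- is decoration on top of that loop.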
the operating system is a layer, and as a layer, it has layers. everything in computing is of course, layers. but as i said, theyre layers of abstraction.
linus torvalds (i dont like him, hes everything wrong with capitalism- except compared to whoever takes his place, and arguably that has already happened) has said that the operating system doesnt really DO anything, or more accurately that you wont see anything it does. the operating system is sort of everything up to the interactive layer.
as far as i know, i just made up "the interactive layer", thats not a "real" layer. applications are the interactive layer.
but we havent gotten to applications yet.
when the operating system loads up, it starts running other things. as the great linux (this is mostly specific to operating systems running the linux kernel) civil war of about a decade ago demonstrates, this is known as "init" because its where the computer inits things. im just being obtuse instead of talking about things like sysvinit, busybox and distros that write their own alternatives. after several years i finally moved to bsd, and to hell with linux init systems.
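stripped of the politics, what an init system does is almost embarrassingly simple: process number one starts the services and keeps an eye on them. a toy sketch in python (the service list is made up- a real init would respawn things that die, reap orphans, and never, ever exit):

```python
import subprocess

# a toy init: start every "service", then wait on them.
# a real init respawns services that exit; this one just watches.
SERVICES = [["true"], ["true"]]  # stand-ins for getty, syslog, etc.

procs = [subprocess.Popen(cmd) for cmd in SERVICES]
for p in procs:
    p.wait()  # real init: if it died, start it again
```

the forever-warring camps are mostly arguing about everything AROUND that loop, not the loop itself.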
funnily enough, even a gnome developer can write a REALLY GOOD init... application if you just write reasonable specs for it. i say this because i equate gnome development with all kinds of things i fundamentally dislike about "modern" computing (and i quite like modern computing, in moderation) but really the problem with gnome isnt the developers, its the specs said developers tend to favour. or so it seems to me.
but the init system is the fun part youll probably miss if youre running copland os, because its not graphical.
well, to be fair its text-based mostly, but its usually running on a framebuffer which is graphical. but its sending text to the framebuffer rather than implementing text as graphics itself.
so abstraction wise, you have the hardware and the operating system, the framebuffer (which i think is handled by the operating system kernel but im not a kernel designer as i think ive made terribly clear) and then the init system is PROBABLY doing text on a framebuffer by now. unless youve managed to disable the framebuffer. because thats pretty fun.
and if you dont know, the framebuffer is just what the operating system does graphics (but mostly text) on until a REAL graphics environment is loaded. when xorg (sorry, i meant wayland) isnt running and youre switching between vts, thats typically the framebuffer.
so the init system used to run a thing called getty, where "tty" is the modern (1970s unix) abbreviation for the typewriter i keep mentioning. its short for "teletype". and then you get a login on each tty.
but weve either thrown all that out and redone it, or absorbed it into the master control program, or whatever they do on "linux" these days, at least until its redesigned again- i mean there are two forever-warring camps in computing.
theres the camp that actually makes things work for a quarter century (or more) at a time, first because they can and then because they prefer to, and then because they NO LONGER prefer to, but corporates keep making them do it.
then theres the camp that pretends the other camp LITERALLY NEVER changes anything JUST BECAUSE (and certainly not because of "facts and logic" but only out of a perverse sentimentality) and has to change literally everything because "its better"(tm).
no extra points for guessing which camp im in. i already said im using bsd. ken thompson is better known for unix (or possibly go) but he actually took a sabbatical from bell to HELP bsd "pirate" his operating system, hes just that cool a guy. i believe he also wrote a basic implementation (as in, the language from dartmouth) once.
but apart from running a login prompt and ultimately vi, ksh or various other things that also work when you login to a unix-compatible/inspired machine remotely, the other thing a vt can be used for is running wayland or xorg or xenocara (which is either xorg or a fork of xorg) which is to say, a much better idea than using the framebuffer.
i still think the framebuffer is kernel-level, and thats not a layer you actually want to do graphics with. yes, i think its AWESOME that some applications (or toolkits) can be compiled to work on the framebuffer. im actually a huge fan of the framebuffer. its just that if you start actually coding for it, i think you too will see that it should never, ever be used (except for the few things its actually used for) and especially it should not be used for graphics. except maybe when youre booting.
we have an entire operating system now, and a great environment (or perhaps, wayland) for running graphical applications (or better yet, countless instances of xterm because im not really wild about multiplexers) but the thing is- we have a SHELL now.
and the shell is the interactive layer that runs the applications.
and now you begin to see the redundancy that develops gradually around a fully-functional computer system:
you have the firmware which is like the os- except it runs the bootloader, which runs the os.
you have the os itself which is like software, it actually is software, except it runs your software.
you have the software that is the init system, except that the init system runs your software.
but actually it just runs the shells, which run the graphical system that runs your shells, which are of course more interactive software that runs your applications, which are your interactive software.
and "messing with computers" means that when you get tired of what one layer is doing, you go THROUGH that layer to get to the next (or previous) layer to mess with THAT instead.
but then if the os has layers before it even runs the applications, then of course the applications also have layers? right.
so operating systems tend to be written in languages that are closer to the machine code the cpu understands, partly because of course and partly because the operating system has to manage ram, so you want to write operating systems using languages that actually let you allocate and deallocate ram. at least im pretty sure you do, because as i said i dont really develop operating systems.
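python wont normally let you near raw ram, but through ctypes you can actually call the c library's malloc and free and feel what "managing memory yourself" means: you ask for bytes, you get back an address, and if you forget to free it, nobody will ever remind you. (locating libc this way should work on most unix-likes, but thats an assumption.)

```python
import ctypes, ctypes.util

# borrow the c library so we can allocate ram "by hand"
libc = ctypes.CDLL(ctypes.util.find_library("c") or None)
libc.malloc.restype = ctypes.c_void_p   # malloc returns a raw address

addr = libc.malloc(64)                  # ask the allocator for 64 bytes
print(hex(addr))                        # just a number- an address in ram
libc.free(ctypes.c_void_p(addr))        # forget this line and you leak
```

os kernels live at this level all day long, which is a big part of why theyre written in languages where this is the normal way of doing things.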
and it used to be that if you wanted a friendlier programming language, you implemented it the hard way like any other- close to the machine. these days you usually implement programming languages in other programming languages. its not just easier, its more portable to other operating systems.
a library is a bunch of code that you WOULD write in a programming language but dont- because the whole point of the library is SOMEONE ELSE wrote it so you could just write a program that (mostly) uses other libraries, which have already been written.
whats REALLY cool about libraries though, is that not only do they have lots of code thats already written so you dont have to, but they also wrote that code in a different language and then wrote "bindings" so for example, you can be writing python code and most of that python code is just glueing together stuff written in c or c++ (or possibly rust?) and libraries make all this fairly trivial to do.
like, you dont have to know c to use a library written in c, you can just call the functions or "api" the library provides.
which i only mentioned (sort of) because one does not simply write applications, rather one calls libraries provided by toolkits that have implemented a gui on top of x or wayland, using native calls that are decades old but the gui toolkit was just rewritten from scratch in rust a couple months ago.
so operating system -> init -> graphics environment -> toolkits and libraries -> applications
and if that seems like a lot just to edit text, let me tell you that i would never, ever, ever want to write a text editor unless most of it was done already as a library.
which brings me to what SUCKS about libraries, in that you go to all the trouble of learning a library and then implementing an application using said library, and the library goes and folds or forks or is completely rewritten in rust, and now your application depends on something that practically speaking just ceased to exist, but dont worry you can redo everything using a complete new library that no one cared about being compatible with, so just learn an entirely new programming stack so you can reimplement the software you spent years getting just right.
and on a span of 10 or 20 years, some development really is that horrible and painful.
and on a span most developers actually care about, im making an awful lot about nothing.
and eventually you throw your hands in the air and migrate to bsd, because some flavours at least never change anything unless they actually HAVE TO, which hasnt stopped them from literally forking less just so they can PROPERLY implement security at the kernel layer. but its still good, because the fork isnt maintained on github, at least the last time i checked.
so you write some code, it calls some libraries, and its "operating system" (or just software) all the way down to the metal.
are there other layers of abstraction? sure. there are true horrors like message passing (but no one wants to write sophisticated applications without it, so dont let me lead you astray with the notion that theyre pure evil, they just act that way) and then there are developers, which really should count as a layer of abstraction, followed by sponsors and then corporations, then monopolies, then capitalism, the human race, the planet, solar system and galaxies, the cosmos as we know it.
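message passing sounds scarier than it is: one piece of code drops a message in a queue, another picks it up, and neither ever calls the other directly. a toy sketch with two queues and a thread (real systems add addressing, serialization, and the actual horror):

```python
import threading, queue

# toy message passing: the worker never shares state with us,
# it just reads an inbox and writes replies to an outbox
inbox, outbox = queue.Queue(), queue.Queue()

def worker():
    while True:
        msg = inbox.get()
        if msg is None:          # agreed-on "shut down" message
            break
        outbox.put(msg.upper())  # "process" the message, send a reply

t = threading.Thread(target=worker)
t.start()
inbox.put("hello")
inbox.put(None)
t.join()
print(outbox.get())
```

no shared state, no direct calls- just messages. which is also, if you squint, yet another layer.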
but then possibly there are other dimensions, not to mention the mystery of time itself, which i think just to be complete about it ought to loop back to the quantum layer we talked about first. as i said, im not much for mathematics. i will never be really good at physics.
but you sort of get the idea.
now, ill tell you the REAL secret to all this stuff, which is that you can spend 20 years learning all of it or you can spend a couple of years starting to work with it and thinking you have no idea what it is, but actually getting really good experience exploring whatever part of this absurd tangle of technologies appeals to you most.
if you REALLY want to write an operating system, you keep getting closer to the hardware and forget about the stuff at the top.
if you just want to control everything on the computer, you can write an awesome shell that makes you feel like everything on the machine is always at your fingertips. if you want to do lots of multimedia, you dont (normally) implement all that yourself, you write software that calls libraries and hope theyre stable and not a. a weekend project that will be abandoned next year, but youd probably know if it was at that stage of development or worse, b. a mature and STABLE and much-beloved ecosystem, right before microsoft and/or ibm decides to purchase and gut it and sack 75% of the people who made it a reality. so they can add an llm to it, and eventually use it in warfare.
but if ALL computing was this bad, you would only hate it with every fibre of your being, and so would i, and i wouldnt be writing this because id be throwing all my equipment into a bonfire and shouting incoherently, probably while wearing a helmet with horns coming out of each side.
but then no, because HE implemented the curses library.
so thats how layers work.
and if youre still interested in computers after reading that, we can talk about my favourite abstraction:
=> programming-languages.html programming-languages
=> https://fsrhelp.envs.net/
(back to the main page)
=> https://portal.mozz.us/gemini/fsrhelp.envs.net/
(it wouldve been cooler to do it this way instead)
license: 0-clause bsd
```
# 2018, 2019, 2020, 2021, 2022, 2023, 2024, 2025, 2026
#
# Permission to use, copy, modify, and/or distribute this software for any
# purpose with or without fee is hereby granted.
#
# THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
# WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
# MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
# ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
# WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
# ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF
# OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
```