Things you should know by now, but only just discovered

Malor wrote:

Hrdina, if this stuff interests you, look up Ben Eater on YouTube. He has a whole series of awesome videos where he's building an actual working 6502 computer on breadboards. It'll never be anything like the 64 was, but damn, it's an interesting set of videos.

Thanks, his videos were also recommended in the comments from The 8-Bit Guy's video, so I'll have a look.

His process sounds similar to what we do at my job: edit & compile on our PC-based systems, then program an EEPROM on our EDU (engineering design unit) SBC (single-board computer) using a JTAG connection. Our EEPROMs are not socketed, so we have to track how many times we write them, at least on our units that will eventually fly in space. We tend not to do too much debugging & repeated image changes on the flight units.

We can also reprogram our units on orbit. This involves sending the image up to the satellite over a very slow connection, where it gets stored into a buffer. Our bootstrap SW has an EEPROM burner within it.
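
If anyone's curious what that looks like in code, here's a rough C-flavored sketch of the two halves: buffering the uplinked image, then the bootstrap burner walking it into EEPROM. Every name and size here is invented for illustration; the real flight code adds retries, integrity checks, and write-count accounting on top of this.

#include <stdint.h>

#define IMAGE_MAX (256u * 1024u)      /* hypothetical staging buffer size */

static uint8_t  image_buf[IMAGE_MAX]; /* RAM buffer fed by the slow uplink */
static uint32_t image_len;

/* Called for each chunk that arrives over the link. */
void uplink_chunk(const uint8_t *data, uint32_t len, uint32_t offset)
{
    for (uint32_t i = 0; i < len && (offset + i) < IMAGE_MAX; i++)
        image_buf[offset + i] = data[i];
    if (offset + len > image_len)
        image_len = offset + len;
}

/* Once the whole image is up and its checksum verifies, the bootstrap
   burner copies it into EEPROM, waiting out each device write cycle. */
void burn_image(volatile uint8_t *eeprom_base)
{
    for (uint32_t i = 0; i < image_len; i++) {
        eeprom_base[i] = image_buf[i];
        /* a real EEPROM needs a status poll or fixed delay here */
    }
}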

It's true that curry powder is just a spice blend - and that it's a British invention; Indians just use (lots of) the individual spices appropriate for each dish - but there actually is a curry plant: it's what you get curry leaves from, which have a fantastic, addictive flavor. Periodically I get an incredibly strong urge to have dishes with curry leaves in them, and satisfying that urge has never steered me wrong. I can't really describe the flavor, as it's very distinct.

They're a little leathery, so usually they're used to flavor a dish by being thrown in with the spices that bloom in oil before the onions, ginger, and garlic go in. If you're serving people who aren't used to them, you'll fish them out before serving to avoid someone trying to eat them. They're not inedible, but by the time the dish is done they've done their job.

Here's a bunch of recipes that involve curry leaves - the curried avocado is one of my mainstays, and winds up like a chunky, curried version of guacamole. I can't vouch for the others (the cabbage with fennel seeds is also a mainstay, but I've never made it with curry leaves; I do it straight and serve it with sausage as though it's an Indian version of sauerkraut).

How To Use Up Fresh Curry Leaves

Hrdina wrote:

His process sounds similar to what we do at my job: edit & compile on our PC-based systems, then program an EEPROM on our EDU (engineering design unit) SBC (single-board computer) using a JTAG connection. Our EEPROMs are not socketed, so we have to track how many times we write them, at least on our units that will eventually fly in space. We tend not to do too much debugging & repeated image changes on the flight units.

We can also reprogram our units on orbit. This involves sending the image up to the satellite over a very slow connection, where it gets stored into a buffer. Our bootstrap SW has an EEPROM burner within it.

That sounds real neat to someone who compiles webdev bullcrap daily.

I've got a friend whose few train-level interests are an intersection of everything you said above, if you include lunch. So if you don't feel like you're living THE dream, I think you're living somebody's.

Hrdina wrote:

These days I make my living writing software for spacecraft, aircraft, and other such things. However, like many other SW engineers from my generation, I got my start when I was gifted an 8-bit computer by a parent. In my case this was a Commodore 64, which is still secured in my attic because I can't bring myself to recycle it.

I spent more time than I care to admit just playing with that, coding it in BASIC, and acquired some reference manuals that helped me learn more. I knew that some of the fancy commercial games I played were created using something called machine language, but never had that bit of insight or small push from an outside source to learn how to do that myself.

I never heard of a compiler until I went to a summer program after my junior year in HS (NJ Governor's School in the Sciences), and that was actually just a compiled version of BASIC. My first CS class in college was the first time I had ever seen a programming language without line numbers (Pascal). Over the following few years my horizons expanded a lot, but I never did take the time to go back and apply any of that to my first programming love, the C64.

So, this video from The 8-Bit Guy filled me with a mix of nostalgia, regret, and a feeling of "I should have known this by now".

I took a semester of Assembly in college. That's as close to machine language as I need to get. :D

Quintin_Stone wrote:

I took a semester of Assembly in college. That's as close to machine language as I need to get. :D

Yeah, same here. It definitely gave me an appreciation for all of the niceties that interpreted and compiled languages (which I was already pretty familiar with) brought.

Baby Oil isn't made from real babies!

merphle wrote:
Quintin_Stone wrote:

I took a semester of Assembly in college. That's as close to machine language as I need to get. :D

Yeah, same here. It definitely gave me an appreciation for all of the niceties that interpreted and compiled languages (which I was already pretty familiar with) brought.

I took an assembly language class back in the day. It was difficult. Everyone did real bad on the tests.

Quintin_Stone wrote:

I took a semester of Assembly in college. That's as close to machine language as I need to get. :D

That's almost the same thing; assemblers mostly just let you name entry points and variable locations, and move things around easily. Nobody sane works in actual machine code unless they have to.
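
To make that concrete: an assembler's output is an almost one-to-one translation. Here's a hand-assembled 6502 fragment, shown as a C byte array purely so the correspondence is visible (the opcodes are from the standard 6502 set; the target address is the first cell of C64 screen memory):

/* Each mnemonic becomes an opcode byte plus operands; the assembler's
   main gift is letting you write labels instead of raw addresses, and
   recalculating them when code moves. */
unsigned char fragment[] = {
    0xA9, 0x00,        /* LDA #$00   ; load the accumulator with zero  */
    0x8D, 0x00, 0x04,  /* STA $0400  ; store it at $0400 (operand is   */
                       /*              encoded low byte first)         */
    0x60,              /* RTS        ; return                          */
};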

Danjo Olivaw wrote:

That sounds real neat to someone who compiles webdev bullcrap daily.

I've got a friend whose few train-level interests are an intersection of everything you said above, if you include lunch. So if you don't feel like you're living THE dream, I think you're living somebody's. :D

When I was a kid, I was sure I was going to be an astronaut. Getting to work in the space industry is as close as I'm going to get to that, so I really enjoy what I do.

I could probably have retired by now if I had left aerospace for Wall Street back in the '90s like a handful of my co-workers did. As much as I'd like to be retired already, I'm not sad that I didn't trade industries.

tuffalobuffalo wrote:
merphle wrote:
Quintin_Stone wrote:

I took a semester of Assembly in college. That's as close to machine language as I need to get. :D

Yeah, same here. It definitely gave me an appreciation for all of the niceties that interpreted and compiled languages (which I was already pretty familiar with) brought.

I took an assembly language class back in the day. It was difficult. Everyone did real bad on the tests.

For us, that nasty class was Compilers. I really liked the Assembly class because it was taught as a mechanism to explain computer architecture.

Malor wrote:
Quintin_Stone wrote:

I took a semester of Assembly in college. That's as close to machine language as I need to get. :D

That's almost the same thing; assemblers mostly just let you name entry points and variable locations, and move things around easily. Nobody sane works in actual machine code unless they have to.

I have had a couple of occasions where I've needed to look at the assembly & machine code output as a debugging exercise. I've only really needed the ML part when I had to pull up the PowerPC manuals to interpret the compiler output. I'm not as fluent as I should be with PPC Assembly, so I used the ML opcodes to make sure I was reading the Assembly correctly.
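
For anyone who hasn't had to do that: checking machine words against the manual is mostly bit-field slicing. A tiny sketch of the kind of sanity check I mean (0x38600001 is PowerPC's classic "li r3,1", which is really addi r3,0,1; the field layout is from the architecture manual, everything else is illustrative):

#include <stdint.h>
#include <stdio.h>

int main(void)
{
    uint32_t insn = 0x38600001u;               /* word as seen in the dump */
    uint32_t opcd = insn >> 26;                /* primary opcode, bits 0-5 */
    uint32_t rd   = (insn >> 21) & 0x1Fu;      /* destination register     */
    int16_t  simm = (int16_t)(insn & 0xFFFFu); /* signed 16-bit immediate  */

    /* opcd 14 is addi; rD 3 and SIMM 1 confirm the "li r3,1" reading */
    printf("opcd=%u rD=r%u SIMM=%d\n", (unsigned)opcd, (unsigned)rd, (int)simm);
    return 0;
}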

Mixolyde wrote:

Baby Oil isn't made from real babies!

You have to read the label to make sure you're getting the authentic stuff.

Hrdina wrote:

When I was a kid, I was sure I was going to be an astronaut. Getting to work in the space industry is as close as I'm going to get to that, so I really enjoy what I do.

When I was 6, I decided I was gonna build airplanes.

38 years later, that's all I've done, and my takeaway is that you shouldn't fix your career at the age of 6.

Quintin_Stone wrote:

I took a semester of Assembly in college. That's as close to machine language as I need to get. :D

I did the same but in high school, and then pretty much never touched it again until last week - when I found myself trudging through assembly code compiled from JavaScript, of all things.

(edit: no comments about JS being trash please)

I messed around with 65C02 assembler back in the Apple IIe days. I'd already come into programming and architecture via formal logic and other languages, so it just kind of cemented my understanding of the interface between hardware and software of the time. Which is useful, but honestly, spending 20 minutes to make the screen blink through 16 colors was, you know, painful. I could make a little guy walk across the screen too. But I quickly figured out my interests lay more in how the parts of the system interacted than in programming, so I became a systems admin.

Some assembler play, though, is healthy for everyone's understanding of how computers work. Even if it's just through Zachtronics games. :-)

Robear wrote:

Some assembler play, though, is healthy for everyone's understanding of how computers work. Even if it's just through Zachtronics games. :-)

Heh. Was just gonna say that the assembler course I did in college has only really gotten used in the context of Zachtronics.

Hrdina wrote:

These days I make my living writing software for spacecraft, aircraft, and other such things. However, like many other SW engineers from my generation, I got my start when I was gifted an 8-bit computer by a parent. In my case this was a Commodore 64, which is still secured in my attic because I can't bring myself to recycle it.

I spent more time than I care to admit just playing with that, coding it in BASIC, and acquired some reference manuals that helped me learn more. I knew that some of the fancy commercial games I played were created using something called machine language, but never had that bit of insight or small push from an outside source to learn how to do that myself.

I never heard of a compiler until I went to a summer program after my junior year in HS (NJ Governor's School in the Sciences), and that was actually just a compiled version of BASIC. My first CS class in college was the first time I had ever seen a programming language without line numbers (Pascal). Over the following few years my horizons expanded a lot, but I never did take the time to go back and apply any of that to my first programming love, the C64.

So, this video from The 8-Bit Guy filled me with a mix of nostalgia, regret, and a feeling of "I should have known this by now".

I learned Assembly language on my Commodore (SX-64)! It had a built-in assembler, and there were helpful programs like TurboAssembler that made things a lot easier. The 8-Bit Guy describes machine language perfectly. Having every bit of hardware treated like a memory address makes things seem like magic. You get a map of the computer you're working on, and you tell everything what to do.
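
That carries straight over to compiled languages on the same hardware. A minimal C sketch (as something like the cc65 cross compiler would build for the C64; the one hard fact here is that the VIC-II border color register lives at $D020, BASIC's famous POKE 53280):

#include <stdint.h>

/* Writing through this pointer is the hardware access; there's no OS
   or driver in between on the C64. */
#define BORDER ((volatile uint8_t *)0xD020u)

void border_rainbow(void)
{
    for (;;)                              /* loop forever                 */
        for (uint8_t c = 0; c < 16; ++c)  /* VIC-II has 16 colors         */
            *BORDER = c;                  /* the store changes the border */
}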

I also went the Pascal route in school. It seemed like doing things the hard way (at the time) since I had already learned to code in a combo of BASIC/Assembly. It wasn't until I hit C/C++ in college that I found another language I liked (Fortran can also suck it).

I learned assembly language by being thrown into the deep end when I got my first real programming job. I worked on (parts of) the Atari ST BIOS.

Fortunately they gave me a month or so to get up to speed, which mostly consisted of reading a Motorola 68000 manual and BIOS code. I already had a basic foundation in programming, so I wasn't an utter neophyte.

The other thing that helped was that I was often assigned the task of chasing down bugs. I knew C, I knew how a compiler worked (thanks to a previous mentor who was a compiler designer), so I got to work on it first-hand because the only debugger we had at the time was a home-grown assembler-level debugger.

I don't think I ever worked on actual machine language, as in the binary output of an assembler program, but there were chip designers in the department who did.

BadKen wrote:

I learned assembly language by being thrown into the deep end when I got my first real programming job. I worked on (parts of) the Atari ST BIOS.

Fortunately they gave me a month or so to get up to speed, which mostly consisted of reading a Motorola 68000 manual and BIOS code. I already had a basic foundation in programming, so I wasn't an utter neophyte.

The other thing that helped was that I was often assigned the task of chasing down bugs. I knew C, I knew how a compiler worked (thanks to a previous mentor who was a compiler designer), so I got to work on it first-hand because the only debugger we had at the time was a home-grown assembler-level debugger.

I don't think I ever worked on actual machine language, as in the binary output of an assembler program, but there were chip designers in the department who did.

IMAGE(https://i.ytimg.com/vi/1_VRmgjoa5A/maxresdefault.jpg)

Daddy?

Funny story: I am responsible for that shimmering rainbow in that version of TOS. I added it just before we were burning EPROMs to send off to manufacturing (I was also one of the two-person release team), and Leonard Tramiel thought it was cool enough to leave in.

I got the idea after talking with Dave Staugas, who was an old-school Atari video game developer and worked on graphics for the ST. He told me how he used to cheat to get extra colors on the screen, so I used the same technique so that even in 16-color mode (not 16 bits... 16 colors), the rainbow worked. It involved setting up an interrupt that fired just long enough after the HBLANK interrupt, and writing an interrupt handler that stuffed and unstuffed the color registers. That interrupt handler had to execute in just the amount of time required for the CRT beam to draw the icon. I literally counted CPU clock cycles for instruction execution.

Of course that would never work on a computer with an upgradable CPU. Back in the day we did not have nanosecond clocks. It also probably doesn't work on emulators.
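
For the curious, the shape of the trick looks roughly like this in C (a loose sketch only: the real handler was hand-counted 68000 assembly, and while the ST's palette registers really do sit at $FF8240, the hue helper and the number of entries touched here are made up):

#include <stdint.h>

#define ST_PALETTE ((volatile uint16_t *)0xFF8240u) /* 16 hardware colors */

/* Made-up hue picker; the real code had a per-scanline color table. */
static uint16_t rainbow_hue(int i) { return (uint16_t)(0x0007u + (unsigned)i); }

/* Runs a fixed delay after HBLANK, just before the beam reaches the icon. */
void rainbow_handler(void)
{
    uint16_t saved[4];

    for (int i = 0; i < 4; i++)      /* stash the entries we'll clobber */
        saved[i] = ST_PALETTE[i];
    for (int i = 0; i < 4; i++)      /* stuff in the rainbow colors     */
        ST_PALETTE[i] = rainbow_hue(i);

    /* ...the beam draws the icon's slice of this scanline here; the
       original counted CPU cycles so this window came out exact... */

    for (int i = 0; i < 4; i++)      /* unstuff before the beam moves on */
        ST_PALETTE[i] = saved[i];
}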

Found it:

Is "GEM" pronounced like "GIF"?

merphle wrote:

Is "GEM" pronounced like "GIF"?

I'm just saying, there would be no confusion if they spelled it like the lead singer of The Holograms, and I am outraged that they didn't. Truly, truly, truly outraged.

BadKen wrote:

Funny story: I am responsible for that shimmering rainbow in that version of TOS.

I know someone famous!

As an original Amiga owner, I think it's pretty cool to hear someone who worked on TOS referring to someone else as "old school."

Unrelated, but my TIL is that apparently Japanese had no gendered pronouns until the 19th century, when a new term for "she" was adopted for use when translating from other languages.

fenomas wrote:

As an original Amiga owner, I think it's pretty cool to hear someone who worked on TOS referring to someone else as "old school."

Yeah, well Staugas wrote games for every Atari console, so that’s pretty old school. To get any more old school, you’d have to have worked on creating a mini or mainframe OS (UNIX was 1971). 1977 was the year of the Atari VCS (2600), the Apple II, and the Commodore PET. I was a sophomore in high school, connecting to a time-share system via a printer terminal at school. Before that it was kit computers or major-appliance-sized boxes that required their own HVAC.

When I started college, my desktop computer was an Atari ST.

Quintin_Stone wrote:

When I started college, my desktop computer was an Atari ST.

You started after I did, didn't you? (I was fall '91.) The ST might've been a bit old but having any PC around then was better than none. My college PC was a 386SX/20. Even it was a generation behind. Don't think I've ever had a current gen machine since the Apple IIc.

Understatement:

LouZiffer wrote:

a 386SX/20. Even it was a generation behind

I wasn’t anywhere near college by then myself, but we tried to stay just one generation behind, which meant a 386 a few years before that.

Edit: Hmmm... seems like the generations were crazy long back then. There wasn’t anything between the 486s (1989) and the first Pentiums (1993)?

My first college desktop in '97 was an AMD K6-233, and it ran the hell out of Fallout 1 and Curse of Monkey Island.

LouZiffer wrote:
Quintin_Stone wrote:

When I started college, my desktop computer was an Atari ST.

You started after I did, didn't you? (I was fall '91.) The ST might've been a bit old but having any PC around then was better than none. My college PC was a 386SX/20. Even it was a generation behind. Don't think I've ever had a current gen machine since the Apple IIc.

Yeah, by a year. I could play Dungeon Master on my ST, not so much on a 386.

That was back when performance doubled or more between gens, but jeez, the prices on stuff back then. Manufacturers have gotten good at selling stuff sliced extra thin and cheap(er).

Tanglebones wrote:

My first college desktop in '97 was an AMD K6-233, and it ran the hell out of Fallout 1 and Curse of Monkey Island.

My first (and only) college desktop was an HP-10C calculator. I think you could program it to play a single-player submarine version of Battleship.

Is the slang “borked” based on a misspelling of “broke” or “broken”? (I always assumed it was a polite way of saying “f*cked.” See also: “jacked up.”)

If it is, I only just realized it today, when I typed that something was “borken.”