The Joys Of Programming

It really comes down to what you are trying to do. A laptop, especially a MacBook with its reputation for CPU throttling, will perform at a fraction of what a desktop can do if you want to run anything demanding like a bunch of Docker containers (as you mentioned). It also has more spec limitations. If you are doing really light programming tasks, then it is probably a good choice.

My desktop is 6 years old now and it still runs the same as or better than most brand-new laptops. It is a dual quad-core Xeon (16 logical processors) with 32GB of RAM. I have 6 monitors connected to it. I am starting to look at getting a new desktop, which I plan to have at least 16 cores and 128GB of RAM; it would cost around the same as a top-of-the-line MacBook.

I use a new MBP for work, which recently replaced a 2017 model. I'm not really an "Apple fan" (this is the only Apple device I use), but it is a reasonably nice machine.

The good: USB-C. The one thing Apple did right is go all-in on USB-C, which has probably pushed adoption years ahead of where it otherwise would be. We should be demanding more USB-C buy-in from everyone else. Of course we should want a reversible, power-delivering expansion port that can handle everything.

I have a single USB-C cable running from my closed-lid laptop to a monitor, pushing a 3440x1440 @ 100 Hz video signal to it, while the monitor sends power back to the laptop over the same cable. My laptop charger stays in my travel bag. It's not only the best docking solution I've ever had, but the best docking solution I've ever seen anyone else have. "USB-C" may technically refer only to the physical connector, but in practical terms, USB-C means USB Power Delivery and DisplayPort Alt Mode video, and pushing people toward the port required for USB 3.2 and beyond. It makes complaints about having to use a little hub for plugging in USB-A devices look silly IMO. I honestly don't know how anyone looks at a bunch of legacy one-function ports (this one's for charging, this one is for HDMI video out, and f**k you if you want them arranged differently) as better.

The bad: the keyboard, still. Compared to my 2017 MBP, it's better. By "better", I mean when I hit the Z key, I get "z" instead of "zzzz", so that's nice. But I do my daily work on a real keyboard, and only use the built-in keyboard when traveling. I would happily add a few millimeters of thickness to the entire laptop and devote all of it to some god damn mother f**king key travel.

That's the thing that drives me nuts about the keyboard. I can deal with dongle hell and the other things, but the lack of travel is awful. It's an ergonomic nightmare and a completely unforced error. I'm worried they'll never fix it because Apple never admits mistakes.

DSGamer wrote:

I'm worried they'll never fix it because Apple never admits mistakes.

I'm worried that it's eventually going to be a touchscreen virtual keyboard.

It already almost feels like mashing against a solid screen, and the "touch bar" seems like a baby step to making the whole thing a touch display.

*Legion* wrote:
DSGamer wrote:

I'm worried they'll never fix it because Apple never admits mistakes.

I'm worried that it's eventually going to be a touchscreen virtual keyboard.

It already almost feels like mashing against a solid screen, and the "touch bar" seems like a baby step to making the whole thing a touch display.

Ew. Wouldn't that be unusable for most people? I need to feel the keys to know where my hands are. Also, I consider the fact that most modern phones don't have a physical keyboard as further evidence that the species should be wiped for a do-over. See what evolves from ravens, maybe.

Danjo Olivaw wrote:
*Legion* wrote:
DSGamer wrote:

I'm worried they'll never fix it because Apple never admits mistakes.

I'm worried that it's eventually going to be a touchscreen virtual keyboard.

It already almost feels like mashing against a solid screen, and the "touch bar" seems like a baby step to making the whole thing a touch display.

Ew. Wouldn't that be unusable for most people? I need to feel the keys to know where my hands are. Also, I consider the fact that most modern phones don't have a physical keyboard as further evidence that the species should be wiped for a do-over. See what evolves from ravens, maybe.

"You just sorta mash something in the ballpark and let autocorrect sort it out. Silly old person."

*Legion* wrote:

"You just sorta mash something in the ballpark and let autocorrect sort it out. Silly old person."

I think the future of keyboards will actually be not far from that. I had an idea a while back (2010, maybe) for a keyboard that is just a touch-screen panel: you put your hands in the default position, and as you type, the computer learns your inaccuracies and moves the keys to compensate.
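For fun, here's a minimal sketch of that idea in Python. The key layout, coordinates, and learning rate are all invented for illustration; a real implementation would need per-finger models and a lot more signal processing.

```python
# Toy "self-correcting touch keyboard": each key's target region drifts
# toward where the typist actually taps, via an exponential moving average.

LEARNING_RATE = 0.2  # how quickly a key's center chases your taps (made up)

# Starting centers for a couple of keys, as (x, y) screen coordinates.
key_centers = {"f": (100.0, 200.0), "j": (300.0, 200.0)}

def classify_tap(x, y):
    """Return the key whose current center is nearest the tap."""
    return min(key_centers,
               key=lambda k: (key_centers[k][0] - x) ** 2
                           + (key_centers[k][1] - y) ** 2)

def learn(key, x, y):
    """Nudge the key's center toward the observed tap position."""
    cx, cy = key_centers[key]
    key_centers[key] = (cx + LEARNING_RATE * (x - cx),
                        cy + LEARNING_RATE * (y - cy))

# A typist who consistently hits "f" a bit high and to the right:
for _ in range(20):
    learn("f", 110.0, 190.0)

# The key has migrated toward the typist's habitual tap point.
print(key_centers["f"])  # close to (110.0, 190.0)
```

The same tap both gets classified and feeds the model, which is more or less the "let the computer absorb your inaccuracies" part of the idea.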

Touch screen keypads are the future. Just learn to type like Data.

*Legion* wrote:
DSGamer wrote:

I'm worried they'll never fix it because Apple never admits mistakes.

I'm worried that it's eventually going to be a touchscreen virtual keyboard.

I just shuddered.

-BEP

I lucked out and bought a 2015 MBP back in the day, and lately I keep hearing it described as the last good model before touch bars and keyboard problems became the trend. It's still trucking, and since I do front-end work without much need for CPU, I reckon I'll probably replace the battery in a year or two rather than upgrading. (I run the GPU pretty hard, but it copes, and I find it useful to develop games on a lowish-spec machine so that performance changes don't go unnoticed.)

OTOH, in the last year or two I've seen a lot of posts (on HN, etc.) from people who are happy to have moved from a MacBook to this or that PC ultrabook, either staying with Win10 or running Ubuntu.

DSGamer wrote:

Touch screen keypads are the future. Just learn to type like Data.

No, no, no. It'll all be voice commands.

The thing about Star Trek voice passwords is that they clearly have the additional security of looking for the voice pattern in addition to the content, so they have multi-factor authentication by default.

You do have to change them after every time you use them, so that's a definite drawback.
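In the spirit of the joke, the whole Starfleet scheme fits in a few lines: accept a passphrase only when both the spoken content and the speaker's voiceprint match, and burn it after one successful use. Everything here (class name, the string standing in for a real voiceprint model) is invented for illustration.

```python
# Tongue-in-cheek sketch of a one-time, two-factor voice password.

class VoicePassword:
    def __init__(self, phrase, voiceprint):
        self.phrase = phrase          # factor 1: what you say
        self.voiceprint = voiceprint  # factor 2: who says it (stand-in for a real model)
        self.used = False

    def verify(self, spoken_phrase, speaker_print):
        # One-time use: a replayed recording of an old phrase fails.
        if self.used:
            return False
        ok = (spoken_phrase == self.phrase) and (speaker_print == self.voiceprint)
        if ok:
            self.used = True  # burn the phrase after a successful use
        return ok

pw = VoicePassword("my voice is my passport", voiceprint="werner-brandes")
print(pw.verify("my voice is my passport", "werner-brandes"))  # True
print(pw.verify("my voice is my passport", "werner-brandes"))  # False: already used
```

The rotation-after-use bit is exactly the drawback noted above: every successful login invalidates the credential.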

hi my name is Werner Brandes, my voice is my passport, verify me

*Legion* wrote:

hi my name is Werner Brandes, my voice is my passport, verify me

Mary McDonnell saying "Passport" has always done it for me.

-BEP

bepnewt wrote:
*Legion* wrote:

hi my name is Werner Brandes, my voice is my passport, verify me

Mary McDonnell saying "Passport" has always done it for me.

-BEP

Mmm.

My bank's phone system asks me to say "My voice confirms my identity" whenever I call them.

*Legion* wrote:

hi my name is Werner Brandes, my voice is my passport, verify me

Holds up really well.

Mixolyde wrote:
*Legion* wrote:

hi my name is Werner Brandes, my voice is my passport, verify me

Holds up really well.

The only thing that doesn't hold up is the '90s hairstyles.

I watched some clips on YouTube yesterday thanks to these comments and thought it was funny how much of Ben Kingsley's dialogue could be copied and pasted right into Mr. Robot season 1. Cosmo was F-Society over 20 years early.

Is Sneakers effectively cyberpunk?

I always appreciated that they gave a little wink at why David Strathairn's character is called Whistler but didn't fully explain the reference.

Huh, I wonder if people who've only had the thin, crappy keyboards will like normal ones? I liked the old scissor switches on their laptops from the 2009 era; those were super comfortable, but the modern butterfly ones sound really unpleasant. Will people preferentially use crappy keyboards because that's what they learned?

And, yes, the thing about Apple never, ever admitting a mistake is maddening.

IMAGE(https://i.redd.it/alhboe19j1431.jpg)

Feeling this today.

https://tonsky.me/blog/disenchantment/

Your desktop todo app is probably written in Electron and thus has a userland driver for an Xbox 360 controller in it, can render 3D graphics, play audio, and take photos with your web camera.
The whole webpage/SQL database architecture is built on a premise (hope, even) that nobody will touch your data while you look at the rendered webpage.
A DOS program can be made to run unmodified on pretty much any computer made since the '80s. A JavaScript app might break with tomorrow's Chrome update.

It seems like nearly everything in that rant is either just outright wrong, or a demand that some mythical 'they' fix some vaguely-defined problem.

A lot of the rant is silly, but this part is definitely accurate:

A simple text chat is notorious for its load speed and memory consumption. Yes, you really have to count Slack in as a resource-heavy application. I mean, chatroom and barebones text editor, those are supposed to be two of the less demanding apps in the whole world. Welcome to 2018.

Slack is a web browser posing as a chat app. It's sh*t, but it's also free.

absurddoctor wrote:

It seems like nearly everything in that rant is either just outright wrong, or a demand that some mythical 'they' fix some vaguely-defined problem.

Have to agree. It reads like a bunch of variations on "Android apps are really huge these days. Why are they so big? I have no idea. But they should be smaller."

It reminds me of Chesterton's fence. "There are a lot more fences than there used to be. Why doesn't someone get rid of all these fences?"

Opening Inform

Really interesting presentation from Graham Nelson, creator of the Inform language used for interactive fiction, on the big-picture process of maintaining and open-sourcing his language.

I find the "good ol' days" arguments about software and development uncompelling.

Even in circles like Vim users, you get the people who are ardently against plugins, because then Vim might open in 75 milliseconds instead of 25 milliseconds and might consume *gasp* multiple megabytes of memory.

Some people aren't happy unless their CPU is completely idle and their memory totally unallocated.

There is an in-between, though, right? I get that “your cellphone should boot in 1 second” is hyperbolic. But I regularly have Chrome and Electron-based apps using up 12+ GB of memory.
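That 12+ GB claim is easy to check for yourself on Linux, because Chrome and Electron apps spawn many processes and a task manager's per-window number undercounts them. A rough sketch (Linux-only, since it reads `/proc`; the name-matching is just a substring search):

```python
# Sum the resident set size (VmRSS) of every process whose command line
# contains a given fragment, e.g. "chrome" or "slack".

import os

def total_rss_kib(name_fragment):
    """Total VmRSS in KiB over processes whose cmdline contains name_fragment."""
    total = 0
    for pid in os.listdir("/proc"):
        if not pid.isdigit():
            continue
        try:
            with open(f"/proc/{pid}/cmdline", "rb") as f:
                cmdline = f.read().replace(b"\x00", b" ").decode(errors="replace")
            if name_fragment not in cmdline:
                continue
            with open(f"/proc/{pid}/status") as f:
                for line in f:
                    if line.startswith("VmRSS:"):
                        total += int(line.split()[1])  # value is reported in kB
        except (FileNotFoundError, ProcessLookupError, PermissionError):
            continue  # process exited or is off-limits; skip it
    return total

print(f"chrome: {total_rss_kib('chrome') / 1024:.0f} MiB")
```

Note that summing RSS double-counts pages shared between the processes, so this is an upper bound; it still makes the "a text chat costs gigabytes" point vividly.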

I think the point of the rant (and it's definitely a rant) is that we take advantage of the fact that compute time is cheaper than human time.