The Joys Of Programming

Feeling kind of trapped right now. When I came back from Australia I was really rusty on my main language, Java. And I was definitely rusty on current practices for doing microservice development in the cloud, which is where most Java work is these days. I'd been doing ad agency work for the last 3 years: 2 years in Portland and then a year in Australia. After my mom passed away I fell into the first job I could get because I just wanted to get back to work. It was another ad agency that promised to let me roll off of full-time AEM work (AEM being what they needed a contractor for).

14 months later I'm still programming in AEM, which is inscrutable, poorly documented, and whose API changes on a whim with little to no warning. I'd like to go back to mainline development, but now I'm almost 5 years out from the last time I did full-time application development in Java (as opposed to CMSes, websites, etc.).

Trying to think of how to proceed. AEM's getting old.

DSGamer, I empathise. I've definitely been stuck in jobs where I was left on the maintenance side of the company and could feel everything atrophy. It's generally a case of "a change is as good as a rest."

Get yourself back up to speed with whatever is the state of the art in your chosen technology. Spend a couple of minutes each day on the subreddit for whatever it is. There must be a conference or two where people are presenting the new ideas; all the major ones now have the last few years of talks up on YouTube. I'm not in JVM land, but I'm sure someone here is and could point you in the right direction.

Then, once you're chomping at the bit to work with something, start looking around for a local shop that is doing that. There are an awful lot of weak candidates in the programming world. Unless you are looking for a job at one of the giant places that everyone applies to, I guarantee you they aren't seeing so many skilled candidates that they can afford to turn away good people.

DoveBrown wrote:

Hmm strange using Spectre and Meltdown to complain about the C abstract machine. Those defects are only defects when they break the sandbox.

Is the broader argument not more along the lines of 'the sandbox wouldn't have to be as complicated and breakable if we weren't having to support an abstract machine that is wildly out of date'?

And I am, by no means, educated enough in the particulars of C, C compilers or hardware design to comment.

DanB wrote:
DoveBrown wrote:

Hmm strange using Spectre and Meltdown to complain about the C abstract machine. Those defects are only defects when they break the sandbox.

Is the broader argument not more along the lines of 'the sandbox wouldn't have to be as complicated and breakable if we weren't having to support an abstract machine that is wildly out of date'?

And I am, by no means, educated enough in the particulars of C, C compilers or hardware design to comment.

I'm not super educated on this either, so apologies if this is wrong.

Spectre and Meltdown are side-channel attacks on the memory protection system via speculative execution in modern CPUs. Some memory protection system is required by any multitasking OS (multitasking here in the sense of having more than one process running at a time). Providing a Memory Management Unit of some sort in hardware isn't tied to the C abstract machine; if you want to isolate processes from each other in memory, you need something like it.

Speculative execution also doesn't really have much to do with C. Pipelining of instructions exists mostly because the silicon can't do a lot of operations in a single clock cycle and not all of the silicon is active in each operation. The chain of physical gates to do an operation might take more than a single tick, and if you look at the physical chip, the gates used for a floating point operation are different from the gates that move values between registers. So you can start setting up the registers for the next operation while the current floating point operation goes through. The more operations you can have in flight through the silicon, the higher your throughput. Which brings us to speculation: since you have silicon sitting idle while you wait to find out which way a branch goes, you might as well use it to guess the branch result. With more operations in flight, if you guessed right you save some cycles, and if you guessed wrong you're no worse off than if you had waited. Again, this isn't something specific to C-derived languages.
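If it helps to see why the hardware folks love speculation, here's a rough toy sketch (my own, nothing to do with Spectre itself) that runs the same branchy loop over random data and then over sorted data. On sorted input the branch predictor guesses right nearly every time, so the speculated work is almost never thrown away; on random input it's wrong about half the time and the pipeline keeps getting flushed. Exact numbers depend on your compiler and CPU, and an aggressive optimizer may replace the branch with a conditional move and wipe the difference out entirely.

// build with e.g.: g++ -O2 -std=c++17 branchy.cpp
#include <algorithm>
#include <chrono>
#include <cstdio>
#include <random>
#include <vector>

// Sum only the "big" values; the if() is the branch the CPU tries to predict.
long long sum_big_values(const std::vector<int>& data) {
    long long sum = 0;
    for (int v : data) {
        if (v >= 128) {
            sum += v;
        }
    }
    return sum;
}

// Time 100 passes over the data and return milliseconds.
double time_ms(const std::vector<int>& data) {
    auto start = std::chrono::steady_clock::now();
    volatile long long sink = 0;                 // keep the work from being optimized away
    for (int i = 0; i < 100; ++i) {
        sink = sink + sum_big_values(data);
    }
    (void)sink;
    auto end = std::chrono::steady_clock::now();
    return std::chrono::duration<double, std::milli>(end - start).count();
}

int main() {
    std::vector<int> data(1 << 20);
    std::mt19937 rng(42);
    std::uniform_int_distribution<int> dist(0, 255);
    for (int& v : data) v = dist(rng);

    double unsorted_ms = time_ms(data);          // branch outcome is effectively random
    std::sort(data.begin(), data.end());
    double sorted_ms = time_ms(data);            // branch outcome is highly predictable

    std::printf("unsorted: %.1f ms   sorted: %.1f ms\n", unsorted_ms, sorted_ms);
}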

TLDR: While I'm sure there are some artifacts of the C Abstract Machine leaking into physical chips, the bugs have more to do with the layout of silicon and the fact that hardware engineers HATE having idle gates.

DoveBrown wrote:

Again this isn't something specific to C derived languages.

No, but again I think the argument isn't so much that C causes all this as that C is just the current most popular language built against a PDP-11-like abstract machine, and supporting this PDP-11 abstraction is a source of hardware and compiler complexity, with attendant errors, that we could do away with.

All that being said, it is very hard to imagine what it would be like to program in a modern language that "more truthfully" exposes the operations of modern CPUs. I don't particularly want to have to reason about the interaction of 180 parallel execution units. What little GPGPU stuff I've been exposed to didn't exactly thrill me, but I have quite enjoyed the bits of Erlang I've tinkered with.

I'm not sure what the abstraction is that C is foisting on the hardware designers?

That caches are complex isn't because of the flat memory model; it's because we don't want to add restrictions on what you can use any bit of memory for. Making all memory accessible from any of the computing units (GPU, CPU) was a big win for the Xbox 360 over the heterogeneous architecture of the PS3. The hardware designer doesn't know what the machine is going to be used for, so they have to design for flexibility.

I'm not sure we could make cache coherency easier by saying that some memory is mutable but thread-local and some memory is immutable. If we want threads to communicate, then we're still going to need some way to convert mutable thread-local memory into system-visible immutable memory, and that system-visible immutable memory is still going to need cache invalidation, so I don't think you're going to save huge amounts of silicon in the cache coherency hardware. Programming languages like Rust and Erlang add the abstraction that memory is immutable or not; maybe that abstraction can be supported in hardware, but I don't see how it reduces the complexity under the hood.

It's worth pointing out at this point what the "flat memory model" means. In my view it means that memory is one homogeneous block which can be addressed in sequence. It's really, really nice: having different memory for the GPU and CPU on modern PCs is a pain, and having lots of different memory pools would be much worse and would probably mostly just get offloaded to the compiler. Heterogeneous computing is one of those things that's great for specialised cases, but when the hardware designers aren't in the same building as the programmers it really does suck.
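To make the cache coherency cost concrete, here's another rough toy sketch of mine (numbers will vary wildly by machine): two threads each bump a counter that only they ever touch, so the data is logically thread-local, but when both counters sit on the same cache line the coherency protocol still bounces that line back and forth between cores. Pad the counters onto separate lines and the identical code usually runs several times faster. That traffic happens whether or not the language labels the memory mutable or immutable, which is roughly why I don't see the silicon savings.

// build with e.g.: g++ -O2 -std=c++17 -pthread false_sharing.cpp
#include <chrono>
#include <cstdio>
#include <thread>

struct Unpadded {
    volatile long a = 0;                 // adjacent: almost certainly the same cache line
    volatile long b = 0;
};

struct Padded {
    alignas(64) volatile long a = 0;     // 64 bytes is a typical cache line size
    alignas(64) volatile long b = 0;     // forced onto a separate line
};

template <typename Counters>
double run_ms() {
    Counters c;
    auto bump = [](volatile long* p) {
        for (long i = 0; i < 100000000L; ++i) {
            *p = *p + 1;                 // each thread only ever touches its own counter
        }
    };
    auto start = std::chrono::steady_clock::now();
    std::thread t1(bump, &c.a);
    std::thread t2(bump, &c.b);
    t1.join();
    t2.join();
    auto end = std::chrono::steady_clock::now();
    return std::chrono::duration<double, std::milli>(end - start).count();
}

int main() {
    std::printf("same cache line: %.0f ms\n", run_ms<Unpadded>());
    std::printf("padded apart:    %.0f ms\n", run_ms<Padded>());
}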

PyCon 2018 was in Ohio this year, Cleveland, which is just a few hours away. I'd really been thinking of attending.

And then forgot.

Guess I'll just watch some (more) videos.

Anyone here use a MacBook for programming? If so do you use a newer one? I’m considering upgrading my computer, but I’m highly suspicious of the keyboard.

Lenovo carbon X1 my dude.

DSGamer wrote:

Anyone here use a MacBook for programming? If so do you use a newer one? I’m considering upgrading my computer, but I’m highly suspicious of the keyboard.

I have one. The keyboard is garbage. I only use it when traveling for work. For daily work, I have a nice mechanical keyboard plugged into it.

boogle wrote:

Lenovo carbon X1 my dude.

Yeah, we use Macs at my work so it's either old Mac or new Mac. For home I'll definitely be reconsidering Apple in the future because of what they've done to their hardware.

*Legion* wrote:
DSGamer wrote:

Anyone here use a MacBook for programming? If so do you use a newer one? I’m considering upgrading my computer, but I’m highly suspicious of the keyboard.

I have one. The keyboard is garbage. I only use it when traveling for work. For daily work, I have a nice mechanical keyboard plugged into it.

That's my first impression as well. It's telling that even though I dislike the Apple keyboard circa 2014, I'm considering taking a 2014 external keyboard with me in my backpack if I decide to start using a newer MacBook.

I recently had to configure a new MBP for a coworker, spent some time with it, and wow, that's a bad keyboard. I've got a 2013 MBP and its keyboard is fine, I guess. I spend most of my time with the laptop closed and using external peripherals (Microsoft Ergonomic keyboard, etc.), but given the choice between the modern keyboard and one from 5 years ago, it's not even close.

Yeah, that’s how I work as well. Split ergo keyboard with the laptop lid closed. But I’ve worked hundreds of hours on that Mac keyboard (2013 MBPr) without any complaints.

The current gen MBP keyboards follow the recent Apple keyboard trends that trade key travel for overall stiffness. The current desktop keyboards have followed a similar trend, but currently still have more travel than the laptops.

I am ambivalent about the keyboard. My mind hates typing on it, but I think I actually type faster. ¯\_(ツ)_/¯

After a bit of acclimation time I don't find it that bad to use. I still like the previous gen better.

What I really wish is that someone would make a laptop keyboard where the keys are some color that won't show every single micro-liter of finger grease that I might put on the keys by accident. This is a complaint I have about all of the Apple laptops. Their sleek looking black keys look good for about two weeks, and then are trash forever.

DSGamer wrote:

Anyone here use a MacBook for programming? If so do you use a newer one? I’m considering upgrading my computer, but I’m highly suspicious of the keyboard.

I have a "13-inch, 2017, Four Thunderbolt 3 Ports". The keyboard actually isn't all that bad for typing on once you get used to it, provided you work in a Howard Hughes style clean room. The slightest bit of dust and the keys stick or feel mushy. Also, a colleague has worn through the "S" key in less than 6 months on his.

*Legion* wrote:

For daily work, I have a nice mechanical keyboard plugged into it.

I mean same. Getting a nice setup with my monitors and a kb/m switch really helped.

Please to be linking buy link for USB kb/m switch that actually works 100% of the time and fools each machine into thinking that a device is always plugged in, even when you're switched off of it.

Oh, it does not fool each machine; my mouse, keyboard, and USB mic disconnect when I switch to the other machine.

My big pet peeve at the moment is that I turn my touchpad off manually using synclient when I'm working on my monitor, since the laptop is closed, but my login manager (light-locker) always flips it back on, so when my login screen is up the mouse jumps around like a lil bug.

Zelos wrote:
DSGamer wrote:

Anyone here use a MacBook for programming? If so do you use a newer one? I’m considering upgrading my computer, but I’m highly suspicious of the keyboard.

I have a "13-inch, 2017, Four Thunderbolt 3 Ports". The keyboard actually isn't all that bad for typing on once you get used to it, provided you work in a Howard Hughes style clean room. The slightest bit of dust and the keys stick or feel mushy. Also, a colleague has worn through the "S" key in less than 6 months on his.

The 2017 versions have slight changes from the 2016 version keyboards.

boogle wrote:

Lenovo carbon X1 my dude.

After my work Mac died I bought an HP Spectre because I thought I could get away with a non-Mac ultrabook (it just runs Fedora). And it is a beautiful machine. But the keyboard is all wrong, and the touchpad, while better than Dell's, doesn't have a good enough thunk to it. I don't like the haptic touchpads on the new Macs, but the current ThinkPad touchpad has a better feel to it than even the prior king of laptops, the 2015 15" Retina MacBook Pro. The ThinkPad touchpad is predictable Synaptics junk under Windows, because it's not a Precision driver touchpad, but it's great under Linux.

I tried an X1 Carbon because they had one on sale at Costco, and was like "oh, damn, this keyboard's pretty good, and that touchpad", and I'm returning the Spectre because of it. But instead of the X1 I got a T480 with the 1440p screen and a built-in gumstick drive; it's got another gumstick slot and a 2.5" bay, and it's still super thin and manageable. The X1 is really nice hardware, but it's got worse heat management than the T480. It's also more expensive and unmaintainable compared to the T480 or even the T480s (which might be a good compromise). It's also got a bigger footprint than the X280--which I really considered as well, because I had an X220 once upon a time and I love that form factor, but even at 12.5", 1080p is not enough.

The X1 looks really cool, don't get me wrong, but unless you're buying based on that I think a slightly heavier laptop that gives you over twice the battery life (separate but related: Lenovo's Linux-specific power management stuff is wild) is a good call for most developers.

EDIT:

DSGamer wrote:
boogle wrote:

Lenovo carbon X1 my dude.

Yeah, we use Macs at my work so it's either old Mac or new Mac. For home I'll definitely be reconsidering Apple in the future because of what they've done to their hardware.

Unless you're using Xcode or some weird stuff like Coda or Brackets, you can totally use Fedora. Probably learn a good bit, too.

If you are using Xcode--well, my sympathies.

Ed Ropple wrote:
DSGamer wrote:
boogle wrote:

Lenovo carbon X1 my dude.

Yeah, we use Macs at my work so it's either old Mac or new Mac. For home I'll definitely be reconsidering Apple in the future because of what they've done to their hardware.

Unless you're using Xcode or some weird stuff like Coda or Brackets, you can totally use Fedora. Probably learn a good bit, too.

If you are using Xcode--well, my sympathies.

I'm not using Xcode. I actually ran only Linux from like 1996 - 2010. It was really the introduction of the modern Mac and especially the MacBook Pro line that moved me over to Mac. I would hate to make the switch back at this point, but I really don't like what they've done to their hardware and I think I could make peace with running a Linux distro again as there isn't much I do with a computer these days that's terribly hardware specific.

In the early 2000s when support for WiFi drivers under Linux was pretty iffy and power management was even worse, a switch to Mac was a welcome change. The hardware was awesome and the operating system just worked. Both of those things have changed, though. It makes me a bit sad, to be honest.

Thankfully my 2013 MacBook Pro is still going strong. I'll just run that into the ground. The real question is what to do at work.

Fedora's legitimately fantastic these days, FWIW. GNOME 3 is great, dev tools work out of the box, and Docker is no longer a (virtualizing) battery suck. My last development Mac got watered; I bought a new top case (because Apple wouldn't replace it, 'cause liquid) secondhand and will be building it back up as a backup machine, but I am pretty done with them as daily drivers.

I recalled you saying in the past that you'd used Linux, but things have changed significantly since 2010 (which was about the last time I'd run it as a desktop, too), hence the learn-a-good-bit comment.

I dunno man I bought it like 2.5 years ago

Ed Ropple wrote:

Fedora's legitimately fantastic these days, FWIW. GNOME 3 is great, dev tools work out of the box, and Docker is no longer a (virtualizing) battery suck. My last development Mac got watered; I bought a new top case (because Apple wouldn't replace it, 'cause liquid) secondhand and will be building it back up as a backup machine, but I am pretty done with them as daily drivers.

I recalled you saying in the past that you'd used Linux, but things have changed significantly since 2010 (which was about the last time I'd run it as a desktop, too), hence the learn-a-good-bit comment.

Good point. I think the last time I used Linux frequently was early CentOS 5. I’m sure Linux subsystems have changed quite a bit since then.

We are running CentOS 7 at work. I have quite a high-spec machine and CentOS 7 has been fine (good, even) except for the following hilarity:

Anything up to and including CentOS 7.5 was a complete ball ache to configure for 4K monitors. CentOS 7.6 with GNOME 3.25 has somewhat fixed that, but like all versions of GNOME the settings are scattered to the four winds. You can at least make everything look nice and scale correctly with somewhat minimal settings changes (unlike before, where you had to manually edit all sorts of files).

We found that the upgrade from 7.5 to 7.6 broke several machines. I vaguely recall that change rolled in the Spectre/Meltdown kernel fixes and it just broke lots of stuff. In the end we just had to do clean 7.6 installs.

If you're planning to switch to Linux, X remains a complete trashfire of horror.

If you're planning to switch to Linux, X remains a complete trashfire of horror.

That sounds about right. Note that you typically can beat it into shape, and if you have a fairly mainstream-ish system (1920x1200 monitor, mostly), with motherboard sound and either Intel or NVidia graphics, it's usually pretty straightforward. But once you're off the straight and narrow, you can be pretty deep in the weeds pretty quickly.

You can straighten things out eventually, but it's a pain in the butt. As an example from the most recent Ubuntu, if I want to raise the font sizes on my 2560x1600 screen, that requires me to install the "gnome-tweak-tool" package, which comes separately from GNOME itself. In that tool I can change the default font sizes, but nowhere in the main system.

And then setting up multiple monitors was very painful... I do this oddball thing where I have a "dead screen" that has no video connected to it. It's actually a Denon receiver, and I use this to drive HDMI sound output. It makes for flawless bit-perfect output and really easy handling of all the various Dolby and DTS high-bitrate formats (at least in Windows, I haven't tried those with Linux), but neither X nor Windows will turn on a screen without forcing you to use it. So I stick the Denon off the bottom-right of the main screen, and if something ends up there on Windows, I can move it back with Windows+Left Arrow a couple of times until it's back on the main screen. On Linux, I have no idea how to move it, but fortunately it's been good about not putting anything down there.

I'm not sure why this "dead screen" setup doesn't happen more often, but I guess it's rare enough that even Windows has trouble with it. Getting it set up with Windows is easy, but it gives me a little ongoing problem. Getting it set up with Linux was quite painful, but it hasn't bothered me at all since it's been running.

DanB wrote:

We are running CentOS 7 at work. I have quite a high-spec machine and CentOS 7 has been fine (good, even) except for the following hilarity:

We also run CentOS 7 (well, a mix of 6 and 7 ... and 5, but those are legacy systems that I refuse to support since they're long past EOL and the owners had plenty of warning) at work, but to the best of my knowledge none of the servers has a UI...

I'm running a 2013 MBPr and will have the ability to upgrade later this year if I want, but I have no idea what I'll do. I don't like the modern MBP nearly as much as the one I have, from both a functional and an aesthetic viewpoint. I spend most of my time either in the terminal, IDEA, or VS Code, so running Linux is a definite option. Though I haven't run Linux as my primary OS since back around 2003 or 2004 (I think I used Fluxbox as my WM at the time), so that would take some getting used to.

Hardware-wise, I'd have to look at the current Federal Government opinion of Lenovo devices, but as of a few years ago we were definitely discouraged from buying them. Dell is the safest and easiest option and it looks like they have a Linux line themselves.

Or who knows, maybe the fall update to the MBP will be good enough that I'll grab one of those.

Also, Xcode is ... not great. I think the best thing I can say about it is that it's improved a ton since I first played around with it back in the 2008/2009 timeframe. I DID enjoy learning Swift in it, even if I'm unlikely to ever use the language again.

Man I dunno I just have a lot of xrandr commands stored.

So after 12 years of embedded C and Ada, I'm now looking at supporting an existing C++ code base. I am way rustier than I realized. Anyone have good suggestions for books or websites that cover advanced-to-expert-level topics? Like, are the Stroustrup books good?