Random Tech Questions you want answered.

So is the answer "crypto miners"?

Either that, or really irritated gamers.

pandasuit wrote:

Interesting choice of game since I actually use that one as an example where my good HDR VA TV makes the game look better than my non-HDR TN monitor does.

I think I accidentally picked a level (in Tetris Effect) that didn't benefit much from HDR to try first. I played some more this weekend with HDR on, and now I get it.

I'm doing some work with the local Scouts BSA and they've asked me about having a looping promotional video in their summer camp store. What's the best way to go about that these days? Their TV set might have playback off a USB stick but I'm not going to bet on it. I know I can do it with a DVD but that definitely doesn't seem like the best answer. Is this a good time to learn about those Raspberry Pi gizmos?

They can't lock down a laptop with a security cable and play it on that? It'll be attended, right? You could run it on an inexpensive, very low-end one; low risk.

Or buy a cheap flat screen tv (like $200) with a USB port and go to town.

Vargen wrote:

I'm doing some work with the local Scouts BSA and they've asked me about having a looping promotional video in their summer camp store. What's the best way to go about that these days? Their TV set might have playback off a USB stick but I'm not going to bet on it. I know I can do it with a DVD but that definitely doesn't seem like the best answer. Is this a good time to learn about those Raspberry Pi gizmos?

My mid-range plasma from 2009 has USB video playback, so I would assume any and all TVs should have that function. A prudent course of action would be to ask them what they have available to connect to it and to get the model # of the TV, so you can determine its capabilities.

Definitely another vote for the "plug a USB stick into a cheap big screen" route.
You can play a video, or load up a bunch of pictures and it should automatically do a slideshow.

39-42in TVs are in the low $200 range. They may be 1080p, but who cares? Oftentimes they are Roku or Fire TVs too.

40" TCL smart TV for $158. It's got a couple of HDMI ports and built-in Chromecast, so you can send video from a PC/phone/whatever, if they have something with that capability.

Will TV's loop a video, or just play it once and stop?

If they don't loop it, that would be very annoying for the store staff to have to restart it all the time.

Videos will loop. I am pretty confident that it will either prompt you, have a setting for it, or loop automatically.

I expect it will loop, based on the number of TVs I've seen in stores with video displays running.

Next step is to see whether or not the available TVs are high definition. They invested in a bunch of AV infrastructure when things were booming... in the early 2000s, right before the HD changeover. Now that the bottom has fallen out of the Scouting population and COVID still has people scared to come to camp, there's barely money for essentials.

I should be able to find somebody to donate an older HD set though.

Vargen wrote:

I expect it will loop, based on the number of TVs I've seen in stores with video displays running.

Next step is to see whether or not the available TVs are high definition. They invested in a bunch of AV infrastructure when things were booming... in the early 2000s, right before the HD changeover. Now that the bottom has fallen out of the Scouting population and COVID still has people scared to come to camp, there's barely money for essentials.

I should be able to find somebody to donate an older HD set though.

If the TV doesn't support it and you can't find a donor to provide one that does, there is a Raspberry Pi image for that. It even appears to support the Raspberry Pi Zero (which only costs about $5, though you do need a few adapters as well).
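If you do go the Pi route and would rather roll it yourself than use a pre-made image, the usual trick is just VLC on a loop. A minimal sketch, assuming VLC is installed and the video lives at /home/pi/promo.mp4 (both the path and the install are assumptions to adapt):

#!/usr/bin/env python3
# Loop a promo video fullscreen, relaunching the player if it ever exits.
# Assumes VLC is installed (it ships with cvlc, the no-GUI binary) and that
# /home/pi/promo.mp4 is a placeholder path for the actual video file.
import subprocess
import time

VIDEO = "/home/pi/promo.mp4"  # hypothetical location of the promo video

while True:
    # --loop repeats the playlist forever; --fullscreen hides the desktop;
    # --no-osd suppresses the on-screen filename overlay.
    subprocess.run(["cvlc", "--loop", "--fullscreen", "--no-osd", VIDEO])
    time.sleep(2)  # brief pause before relaunching if VLC exits or crashes

Start that at boot (cron @reboot or a systemd unit) and the Pi becomes a plug-in appliance: power on, video loops, and nobody in the camp store has to touch it.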

So Intel seem stuck on their 14nm fab process, which is part of the reason why they're losing ground to AMD (who are using TSMC's fabs, which can build smaller circuits). As far as I know, Intel and TSMC both use ASML photolithography machines, so I would think they have the same technical capabilities. What is it that Intel are doing wrong, or that TSMC are doing so right?

Intel would pay someone a billion dollars for that knowledge.

TSMC's engineers may be the only ones that know, and they're not talking.

Hehe, yes, it is a bit like I am asking you to do Intel's homework. Let me narrow the question: we know that Intel and TSMC have access to the same equipment, so Intel can probably make a wafer with the same size transistors as TSMC, but those wafers don't yield working chips (or not enough working chips).

Have there been any public statements from Intel about where things are going wrong for them? Can they make sub-14nm chips, but at far too low a yield? Or is something going wrong elsewhere in the tape-out process?

This article may be of interest. It's not at all just a technical knowledge gap. At least three major companies are competing to deliver the latest process nodes: TSMC (currently in the lead numerically), Intel, and Samsung. How mature a given node is goes a long way toward determining production yield. Also, since there are design differences, you can't directly compare one company's 7nm to another company's 7nm.

It's complicated.
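To put a rough number on the yield side of it: the classic first-order model says the fraction of good dies falls off exponentially with die area times defect density, which is why an immature node can "work" in the lab and still be economically useless. A sketch with illustrative numbers (real defect densities are closely guarded):

import math

def poisson_yield(die_area_cm2, defects_per_cm2):
    # First-order (Poisson) yield model: Y = exp(-A * D0).
    return math.exp(-die_area_cm2 * defects_per_cm2)

DIE_AREA = 1.2  # cm^2, roughly a mid-size desktop CPU die (illustrative)
for d0 in (0.1, 0.5, 2.0):  # defects/cm^2: mature node -> brand-new node
    print(f"D0 = {d0} /cm^2 -> ~{poisson_yield(DIE_AREA, d0):.0%} good dies")

With those made-up numbers the same die goes from about 89% yield on a mature line to about 9% on a raw one, so "can they make sub-14nm chips at all" and "can they make them profitably" are very different questions.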

Robear wrote:

Also, since there are design differences, you can't directly compare one company's 7nm to another company's 7nm.

It's complicated.

Must be complicated because I have never understood how one company's ruler says 7nm and another company's ruler says 7nm but when measured with the first company's ruler the second company's parts are more like 10nm. Do we not have an international standard for what a nanometer is? Pretty sure we do.

Also don't forget that Intel is outsourcing some of its production to TSMC now.

Each manufacturer's transistor density differs even with the same notional size for their process node. So the term "Xnm" is not really a good measure of processor scale anymore.
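To make the ruler problem concrete, here's roughly how the commonly cited third-party density estimates (e.g. WikiChip's numbers; treat them as ballpark, not official figures) line up:

# Approximate peak logic density, in millions of transistors per mm^2.
# These are commonly cited estimates, not manufacturer-confirmed numbers.
DENSITY_MTR_PER_MM2 = {
    "Intel 14nm":    37.5,
    "TSMC 7nm (N7)": 91.2,
    "Intel 10nm":    100.8,
    "TSMC 5nm (N5)": 171.3,
}

for node, density in DENSITY_MTR_PER_MM2.items():
    print(f"{node:<14} ~{density:6.1f} MTr/mm^2")

Note how Intel's "10nm" lands right next to (actually slightly above) TSMC's "7nm": the labels come from different rulers.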

So I went down a bit of a Google rabbit hole, and there is a lot more that differs between fabs than just the ASML photolithography machines. The main ones are the masks that serve as your input and the chemical formulations that you are activating with EUV light. Both of these are proprietary to the fab. The ASML machines are really just the oven in the semiconductor cake.

I did find this cool video where Engadget got to go inside an Intel fab.

Robear wrote:

Each manufacturer's transistor density differs even with the same notional size for their process node. So the term "Xnm" is not really a good measure of processor scale anymore.

Within the same manufacturer's lines, they tend to measure consistently. TSMC's process nodes are more marketing-speak than Intel's, but their 5nm process should still be better than their 7nm process, which in turn should be roughly comparable to Intel's 10nm. Possibly a better measure would be transistors per square millimeter, but marketing teams hate reality more than almost anything else, so that's not likely to happen.

And then, within any given node, you've got varying densities that can be used for different purposes. If you've got an area of relatively slow transistors, those can be smaller, because they won't cause heat problems packed all together. The stuff that needs to run fast is usually done with bigger transistors, spaced more widely.

This is happening, for instance, in Intel's current 14nm++++ process; the chips use much larger transistors than the node name implies, because that's the only way Intel can make them run really fast. In exchange, their new chips are getting very large and hot compared with their AMD equivalents. This means that it costs more for Intel to make them (because each wafer yields fewer chips), and it costs end users more to run them, in terms of heat output; the higher-end Intel chips can put out 300 watts for close to a minute before dropping back to their official TDP. Depending on the motherboard maker, those times can be extended a great deal. If I understood Gamers Nexus correctly, the overclock-focused boards can let an Intel chip run at 300W pretty much indefinitely, if the cooling system can keep up. (There's a toy sketch of how those power limits interact at the end of this post.)

Basically, everyone is cheating on every number they can, so you have to pay more attention to benchmarks than any of the marketing teams. AMD is less egregious; the 5800X, for instance, has a TDP of 105W, but actually pulls 140 watts by default. They're cheating too, just not as much. And everyone ends up with a slightly different experience with any chip, because they're all self-overclocking, and some chips go further than others.

It's becoming a frustrating field; there is a lot of useless noise and outright fabrication in this market. Trying to discern who's lying less is not a position any of us should be in. This is an engineering field, and we should be able to trust the numbers we get from these companies.
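To make the turbo-power sleight of hand above concrete, here's a toy model of how Intel's three knobs interact: PL2 (the short-term burst limit), PL1 (the advertised "TDP"), and tau (the time constant of the power average). All numbers are illustrative defaults, and as noted, board vendors override them freely:

# Toy simulation of Intel's PL1/PL2/tau limiting. Real firmware tracks an
# exponentially weighted moving average (EWMA) of package power; the chip
# may draw up to PL2 until that average reaches PL1, then falls back.
PL1 = 125.0  # watts: long-term limit, the number on the box
PL2 = 300.0  # watts: short-term burst limit
TAU = 56.0   # seconds: time constant of the running power average

avg = 30.0       # the chip was near idle before a heavy load landed
boosting = True
for t in range(91):
    if boosting and avg >= PL1:
        boosting = False          # burst budget spent: drop to PL1
    power = PL2 if boosting else PL1
    avg += (power - avg) / TAU    # one-second EWMA update
    if t % 15 == 0:
        print(f"t={t:>2}s  draw={power:5.0f} W  avg={avg:6.1f} W")

With these made-up numbers the chip holds 300 W for roughly 25 seconds before falling back to 125 W; enthusiast boards stretch tau (or just raise PL1 outright), which is how "pretty much indefinitely" happens.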

Friends of ours just renovated and insulated their house, and they're having trouble with their wifi signal now. I have a good old Linksys WRT54GL lying around gathering dust, but I'm concerned about security. What would be the noob-friendliest way to secure this old piece of hardware, assuming that's still possible? Or should I just keep it in my nostalgia drawer?

dejanzie wrote:

Friends of ours just renovated and insulated their house, and they're having trouble with their wifi signal now. I have a good old Linksys WRT54GL lying around gathering dust, but I'm concerned about security. What would be the noob-friendliest way to secure this old piece of hardware, assuming that's still possible? Or should I just keep it in my nostalgia drawer?

I would think that a third-party firmware would probably be more up to date, security-wise, than the official one, so that would be my first step.

Wasn't Tomato the popular choice for replacement firmware for those?

I've used DD-WRT and it worked fine for my needs, though I haven't updated it in a few years. I've heard of Tomato, but haven't used it.

I had a WRT54GL with DD-WRT years back that I used to bridge a wireless signal. Can confirm that it works pretty well on that model. To be fair, this was over a decade ago though.

It seems like most open source projects like DD-WRT or Tomato stopped supporting the WRT54GL around 2010-2013, but FreshTomato might have a build. It's on the supported devices page, and it seems like I should download the generic K26 MIPSR1 mini image?

WRT54GL? Run screaming.

I used to run DD-WRT on my WRT54GL a loooong time ago but that thing’s been collecting dust for years so my knowledge is way out of date on that model.

These days I can recommend OpenWRT from personal experience. The WRT54GL is apparently not well supported by the latest version due to RAM limitations but if anyone wants to delve into the world of open source firmware for other routers take a look at OpenWRT. I’ve had it running on a couple routers over the years. Modified the source code a little and compiled my own binaries. The community on Reddit and official forums were pretty supportive in my experience.

The WRT54G series was pretty much the genesis of the open firmware projects, but it's almost twenty-year-old hardware. Support for it was dropped because it has only 16 megs of RAM, which is just flat-out insufficient for a router that has to track thousands of connections (e.g., the server-browser routines in many games).
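Rough back-of-envelope on why (all numbers here are guesses for illustration):

# Why 16 MB is tight for a NAT router that tracks connections.
TOTAL_RAM      = 16 * 1024 * 1024   # bytes of RAM on a WRT54GL
SYSTEM_USE     = 12 * 1024 * 1024   # kernel, firmware, wifi driver, buffers (a guess)
CONNTRACK_SIZE = 300                # rough bytes per tracked connection entry

headroom = TOTAL_RAM - SYSTEM_USE
print(f"~{headroom // CONNTRACK_SIZE:,} connections max")  # ~14,000 in theory

Even that theoretical ceiling assumes everything else squeezes into 12 MB; in practice the stock firmware reportedly capped the connection table in the low thousands, which is exactly what P2P traffic and busy game-server browsers would blow through.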

At this point, it's probably best to just toss the thing. Even if you got a micro firmware running, it doesn't support any of the newer wireless standards (n, ac, or ax). Throughput would be terrible, and it would take up frequencies that could be put to much better use.

But... 16 megabytes, that's like 11 floppy disks! High density and all! It was funny to read articles on the WRT54 series from around 2003, where they discuss how luxurious the 16MB is compared to the 4MB of the WRT54S/G.

Thanks for the advice, guys. I told our friends to look for a real extender instead. I'm going to hold on to the WRT54GL, for old times' sake.