Random Tech Questions you want answered.

Yeah, I don't have another micro C to plug power into, unfortunately. I could return this thing, but all the stores have long lines right now to get into them, and I'm not sure I could find anything better.

I've actually had another USB-C docking station break on a work computer I had 2 years ago. Are these things just destined to suck?

FiveIron wrote:

Yeah, I don't have another micro C to plug power into, unfortunately. I could return this thing, but all the stores have long lines right now to get into them, and I'm not sure I could find anything better.

I've actually had another USB-C docking station break on a work computer I had 2 years ago. Are these things just destined to suck?

Yeah, we've had mixed results with them at work with our MacBooks as well. A LOT of them really suck. We've had HDMI ports blow out, casings fall apart, cables break... and those are the ones that at least worked for a while.

I found this one from Anker that I would put faith in (it helps that they list their power input capacities). Anker tends to make pretty solid products. The only thing that would give me a little worry is the line saying the dock requires 12W to operate with a 60W max power input; I'm not sure whether that accounts for both USB 3.0 ports being in use, but even if it doesn't, there's still about 3W of room for your keyboard/mouse, which should be enough power. This one, though, takes 85W max power input and would likely be fine: your charger can provide 65W, the dock takes 15W, and your laptop will take about 45W, leaving another 5W of headroom for peripherals.
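If you want to replay that arithmetic with your own numbers, here's a minimal sketch of the power budget. The wattage figures are just the ones assumed in this post, not manufacturer specs, so swap in whatever your charger and dock actually list:

```python
# Rough USB-C dock power budget, in watts, using the figures assumed above.
def headroom(input_w: float, dock_overhead_w: float, laptop_draw_w: float) -> float:
    """Power left over for bus-powered peripherals (keyboard, mouse, etc.)."""
    return input_w - dock_overhead_w - laptop_draw_w

# Dock rated for 60W max input that needs 12W itself, laptop drawing ~45W:
print(headroom(60, 12, 45))   # ~3W of slack
# Dock rated for 85W max input (15W overhead) fed by a 65W charger:
print(headroom(65, 15, 45))   # ~5W of slack
```

A few watts is plenty for a keyboard and mouse, but it would get tight if you hung something like a bus-powered drive off the dock.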

The ones Dell makes for their laptops are sturdy and durable. Of course, you can’t use them with anything else...

My Bro in law just called and wanted to know how to share an application’s window to a tablet.

His use case:

He is a dentist and uses an application on his Windows 10 computer to view the inside of a patient’s mouth with a camera. He wants the patient to be able to see the same visual on a handheld tablet/screen that the patient holds or a technician holds for the patient.

Basically, kind of how Teams, Discord, Slack, etc. allow screen sharing.

He doesn’t care if the secondary device is a tablet like an iPad, a PC tablet like a Surface, a dumb terminal, etc. Whatever the solution is, it needs to be:

- real easy to use
- software that can be used in a commercial setting
- secure, i.e. not exposed where others can see the share

The first thing that comes to mind for me is him getting a Surface Go (or similar device) that supports being the target of a Miracast - IF his current PC supports Miracast. It’s a cheap solution that doesn’t rely on extra software and is easy to use. I’ve had reliability issues with Miracast, however.

What suggestions do you guys have? The least amount of moving parts and ease of use are important.

-BEP

There are a few existing products that do this:

https://www.splashtop.com/wiredxdisplay
https://www.duetdisplay.com/

I’ve used Duet in the past and it worked well.

I use Duet as well for Windows <-> iPad and it works decently.

For reference, my dentist has wall-mount screens that show the video live. He uses them and the patient can see them at the same time. Works fine when mounted in the patient’s line of sight.

I appreciate the help, folks. I let him know the various options and we'll see where it goes.

Thanks.

-BEP

The wall mounts are on extendable, flexible arms, I should have said.

For those interested, I got another docking station and am having the exact same problem... something systemic is going on with this Razer Stealth...

I think the next attempt I make, I'll just forget about the pass-through power and look for a USB 3.0 docking station. Just wondering, do they even make them so that they can do HDMI? Off to find out!

FiveIron wrote:

For those interested, I got another docking station and am having the exact same problem... something systemic is going on with this Razer Stealth...

I think the next attempt I make, I'll just forget about the pass-through power and look for a USB 3.0 docking station. Just wondering, do they even make them so that they can do HDMI? Off to find out!

I have a Razer Blade Stealth as well and have had tons of issues with USB-C adapters and such (never got a dock, though). The main solution I found was to only buy items explicitly listed as Thunderbolt 3, not just USB-C. If it doesn't say Thunderbolt 3, assume it's just USB 3.0/3.1 and know there's a good chance it won't work. I've also had good luck with items listed as Mac compatible. I have the same issue at work in the building with mostly Macs and usually have to get the expensive first-party adapters because the third-party ones have issues (and F Apple for removing the HDMI ports from the iMacs).

Interesting... I actually didn't know Thunderbolt 3 was different from USB-C. I'll certainly look into it.

Themoreyouknow.gif

They're different standards, and USB-C was updated with an "alternate mode" or something like that which allows the cable to carry a Thunderbolt signal. I don't think it can be USB-C at the same time, I think it has to be one or the other, but I have a vague idea that Thunderbolt can carry USB inside as part of the overall signal.

Originally, it had totally different headers and was an entirely separate protocol. This recent blending is new, and considering how hard it is to even describe, much less implement, it's probably full of bugs.

This is how I understand it:

USB-C is the physical plug, which is currently used for both USB 3 and Thunderbolt 3 protocols. USB-A is the "standard" USB plug and USB-B is most commonly seen as the end you plug into your printer. USB 3 has a different USB-B plug than USB 1 and 2 had, but the A is the same shape.

Thunderbolt is a communications & power protocol that Intel developed; I'm not sure if Apple helped or if they're just one of the very early adopters. Intel has since handed over most or all of Thunderbolt to the consortium that oversees USB, so the USB 4 stuff that's starting to arrive is a generic name for Thunderbolt 3.
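If you ever want to confirm whether a port is actually talking Thunderbolt rather than plain USB 3, here's a minimal sketch for Linux. It only reads the kernel's thunderbolt sysfs tree; the attribute names are an assumption that holds on recent kernels, and on Windows the equivalent check is Device Manager or Intel's Thunderbolt software:

```python
#!/usr/bin/env python3
# List whatever the Linux thunderbolt driver has enumerated.
# Assumption: /sys/bus/thunderbolt/devices exists and exposes
# vendor_name/device_name attributes for attached devices. If the tree is
# missing or empty, the port is likely running as plain USB or DisplayPort.
from pathlib import Path

TB_SYSFS = Path("/sys/bus/thunderbolt/devices")

def read_attr(dev: Path, name: str) -> str:
    attr = dev / name
    return attr.read_text().strip() if attr.is_file() else "?"

if not TB_SYSFS.is_dir():
    print("No thunderbolt sysfs tree; driver not loaded or no TB controller.")
else:
    for dev in sorted(TB_SYSFS.iterdir()):
        print(dev.name, read_attr(dev, "vendor_name"), read_attr(dev, "device_name"))
```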

Vargen wrote:

so the USB 4 stuff that's starting to arrive is a generic name for Thunderbolt 3.

To my understanding, TB3 and USB4 are different protocols. I think a connection starts in USB 4 mode, and then can negotiate to switch to Thunderbolt protocol instead. I vaguely recall it being called something like an 'alternate data mode'.

I think what that means, in practice, is that two devices can just share arbitrary data in any format they want, they just negotiate the format first. (eg, "I speak Thunderbolt 2", "so do I", "let's switch".... and then the data stream between those two devices becomes Thunderbolt 2 format.) Any device in between probably won't understand the data stream, but will pass it through undamaged.

Thunderbolt is one alternate data mode for USB4, and DisplayPort is another. Presumably, there could be more.

edit: It's possible that an alternate-mode stream has to be a direct-connect between devices. Maybe it won't go through a hub? And if it does, will it multiplex regular USB signals too? I have no idea.
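The "advertise, agree, switch" idea above is easier to see as a toy model. This is purely illustrative (made-up mode names and a made-up preference order, not the actual USB4/Thunderbolt handshake), but it captures the shape of the negotiation:

```python
# Toy illustration of "advertise capabilities, pick a common mode, switch".
# Not the real USB4 handshake, just the shape of the idea described above.

PREFERENCE = ["thunderbolt3", "displayport", "usb3"]  # best-first, invented ordering

def negotiate(host_modes: set[str], device_modes: set[str]) -> str:
    """Return the best mode both ends claim to support, falling back to plain USB 3."""
    for mode in PREFERENCE:
        if mode in host_modes and mode in device_modes:
            return mode
    return "usb3"

# A TB3 dock on a TB3 host tunnels Thunderbolt; a plain USB-C hub stays USB 3.
print(negotiate({"usb3", "thunderbolt3"}, {"usb3", "thunderbolt3"}))  # thunderbolt3
print(negotiate({"usb3", "thunderbolt3"}, {"usb3"}))                  # usb3
```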

"Universal" Serial Bus.

USB4 is going to be such a mess.

Malor wrote:

USB4 is going to be such a mess.

As is USB tradition.

Vargen wrote:

This is how I understand it...

Great, now do USB 3.2 Gen 2 2x2 CCX2 with Top Gear wing

This is why I use Bluetooth for everything...said no one ever....

Well, said one person ever, pandasuit on GWJ.

I don't even get HDMI standards anymore.

pandasuit wrote:

This is why I use Bluetooth for everything...said no one ever....

They said it, but people had to wait 100ms to hear it, and only got about half of it.

I saw some iCloud chatter from a few months ago via Search just now, so I'm hoping folks who use iCloud for photos will be able to weigh in with a suggestion, because Googling's not turning up anything promising.

We are an Amazon Prime household and have enjoyed unlimited Photo storage for a long time now. We have about 200k photos (a percentage of which are garbage/auto-saved that we need to clean up, and/or duplicates - I'd imagine 190k actual). In addition, I also pay for 2TB of video storage which isn't free for Prime customers at $120 a year, and we have about 1.2TB of videos thus far.

My wife and kids have phones now, and I've come to find out that in order to upload videos from any device to Amazon, they have to be logged in as me; the video storage allotment doesn't apply to their accounts even though they're part of my Amazon family. So to upload videos, they either need to send them to me or have me log in on their phone or laptop, which I don't want to do all the time (and I don't want them always logged in as me either, because I'm secure like that).

I thought I was annoyed enough by this to pay for iCloud storage and test life using it instead of Amazon Photos, especially since we're all on Apple devices at this point anyhow (we have iCloud turned off so the out of space warnings don't always appear), yet I have two concerns:

1. Moving everything over feels like it would be a nightmare. There are no direct APIs or anything, so I think I'm looking at downloading everything to a Mac and uploading it again? To do that, I'll need an external drive and a long time of babysitting to make sure stuff made it over (one way to sanity-check that is sketched below), but whatever.

BUUUUUT, the bigger problem may be...

2. As I read more about iCloud, I've come to learn that it's not just a "send it to your cloud account and you can delete it off your device" sorta deal. You have to keep everything, or at least a lower-res version of everything, on your phone, laptop, whatever device you have.

Is this right? If so, I'm not ever going to be able to keep all that - even lower-res versions of that - on my phone. Does this effectively make the decision for me, or is there a way you can leverage iCloud as a backup destination without it constantly being synced to all your devices?
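On the "babysitting" worry in point 1, here's a minimal sanity-check sketch, assuming the Amazon export and the copy staged for iCloud both end up in local folders (the paths below are placeholders). It compares per-extension file counts and total bytes between the two trees, which catches the obvious cases where something didn't make it over:

```python
# Minimal migration sanity check: compare two local folder trees.
from collections import Counter
from pathlib import Path

def summarize(root: Path):
    """Per-extension file counts plus total bytes under a folder tree."""
    counts, total_bytes = Counter(), 0
    for f in root.rglob("*"):
        if f.is_file():
            counts[f.suffix.lower()] += 1
            total_bytes += f.stat().st_size
    return counts, total_bytes

# Placeholder paths; point these at the real export and staging folders.
src_counts, src_bytes = summarize(Path("/Volumes/External/amazon-export"))
dst_counts, dst_bytes = summarize(Path("/Volumes/External/icloud-staging"))

mismatches = {ext: (src_counts[ext], dst_counts[ext])
              for ext in src_counts if src_counts[ext] != dst_counts[ext]}
print("mismatched extensions:", mismatches or "none")
print(f"bytes: source={src_bytes:,} destination={dst_bytes:,}")
```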

Thanks in advance!

dejanzie wrote:

I don't even get HDMI standards anymore.

A significant chunk of my problems at work boil down to "HDMI is bullsh*t."

On that note, I have a weird issue.

I tried hooking up my PC to my new 4K TV and it kept resetting: either blinking in and out every 3 seconds, or just acting weird every 3 seconds (with several indicators that the display is resetting something each time).

After trying different HDMI ports, trying different devices in those ports, and also buying a new certified HDMI cable, I found the solution: Physically disconnecting my monitor from the video card. Now the TV works perfectly with my PC.

Clearly there is something my video card (GTX 1080) or the PC does not like about having my monitor connected via DisplayPort at the same time as my TV via HDMI, because even changing the display settings to use only the TV does not stop the connection issue. The monitor is 1440p, so I'm not sure if that's related, but as I said, deactivating the monitor in the settings completely or duplicating the displays at 1440 or 1080 does not solve the problem.

I'm going to get a DVI cable for my monitor instead of DisplayPort to see if that makes a difference, but I'm curious if anyone has any idea of some setting I'm not thinking of, because pulling the cable out of the back of my monitor every time I want to game on my TV is obviously not ideal.

One suggestion I've seen is that this is due to energy-saving modes. In that case, going into NVIDIA Control Panel > Manage 3D Settings > Global Settings > Power Management Mode and setting it to Prefer Maximum Performance will prevent the card from kicking into an energy-saving mode and powering down a DisplayPort. Easy to check, anyway.
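If you want to see whether the card really is dropping into a power-saving state when both displays are connected, here's a minimal sketch that polls nvidia-smi while you plug things in and out (the query fields are standard nvidia-smi ones; the sample count and interval are arbitrary):

```python
# Poll the GPU's performance state while plugging/unplugging displays.
# P0 is the highest-performance state; larger numbers (P8, P12) are power-saving states.
import subprocess, time

QUERY = ["nvidia-smi",
         "--query-gpu=pstate,power.draw,clocks.current.graphics",
         "--format=csv,noheader"]

for _ in range(30):                      # ~1 minute of samples, 2s apart
    out = subprocess.run(QUERY, capture_output=True, text=True).stdout.strip()
    print(time.strftime("%H:%M:%S"), out)
    time.sleep(2)
```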

Could it be that the card is just not capable of driving all the pixels when both are connected?

It should be able to do 2x 4K monitors for desktop stuff, but you'd probably want to limit it to one for graphics-intensive games. But just to run, say, a browser on the monitor and a movie on the 4K TV, it should be fine.

It could be the HDCP copy protection. If the monitor doesn't do HDCP, while the TV does, weird sh*t can ensue. All the outputs have to support HDCP, or none of the HDCP-enabled ones will work right.

Worse, on new HDMI connections, I think they have a new HDCP, because the old one was permanently compromised. So if the monitor only supports the old mode, that could also be the issue.

In other words, it may be doing this on purpose, your hardware refusing to work correctly because you could potentially be a filthy pirate.