At least they had screws? I don’t trust HDMI or, even worse, USB-C. Still using VGA monitors with adapters, never broke a single plug.
I’m still waiting for the other shoe to drop on USB-C/Thunderbolt. Don’t get me wrong - I think it’s a massive improvement for standardization and peripheral capability everywhere. But I have a hard-used Thinkpad that’s on and off the charging cable all day, constantly getting tugged in every possible direction. I’m afraid the physical port itself is going to give up long before the rest of the machine does. I’m probably going to need Louis Rossmann level skills to re-solder it when the time comes.
Edit: I’m also wondering if the sudden fragility of peripheral connections (e.g. headphones, classic iPod, USB mini/micro) and the emergence of the RoHS standard (lead-free solder) is not a coincidence.
On my Thinkpad the ports were both soldered to the mobo, unlike some random other USB daughterboard. Really annoying. On my T430 the port is a separate piece and can be easily replaced along with its cable.
But no, USB-C is pretty tough for me, when done right. But it’s still too small for no reason in laptops.
do you live ON train tracks? how often is shit just falling out around you? usually a pretty cozy fit on most things imo 🤔
do you like the DisplayPort push tab? I feel like many of those are a PITA for real
Hate it. Though there is one that’s worse.
The mini-DP retention clip. There seem to be either wide and narrow variations, or simply on-spec and off-spec variants.
Those clips just jam right into the backplate of the video card.
I sort of miss the screws too but it’s so much better when a cable accidentally gets yanked and it just comes right out instead of transmitting the force into whatever it’s attached to.
Tell that to the USB ports on my laptop.
Good news: USB-C has two formats with screws, either one on each side like VGA or one on top. Though I’ve never seen them in real life.
My display port cable has a clip that you have to press to remove.
Why are you using VGA when DVI-D exists? Or DisplayPort, for that matter.
Because VGA used to be a standard and all the monitors I have lying around are VGA-only.
Kudos for not just trashing them.
Why should I? They’re Full HD and working well, so there’s no reason to. New displays are 100€+, which is freaking expensive for that improvement.
Because there’s plenty of used monitors to be had out there that have DVI on them in some capacity for very reasonable prices.
For instance, I just purchased 4 x 24-inch Samsung monitors for $15 USD each.
All those new video standards are pointless. VGA supports 1080p at 30Hz just fine; anything more than that is unnecessary. Plus, VGA is easier to implement than HDMI or DisplayPort, keeping prices down. Not to mention the connector is more durable (well, maybe DVI is comparable in terms of durability).
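For a rough sense of scale, here is a back-of-the-envelope sketch of the pixel clocks involved, assuming the standard CEA-861 1080p blanking totals (2200 × 1125) and taking roughly 350-400 MHz as a typical RAMDAC rating for VGA-era cards:

```python
# Back-of-the-envelope pixel-clock check for analog 1080p over VGA.
# Assumption: standard CEA-861 1080p blanking totals (2200 x 1125).
H_TOTAL = 2200   # 1920 active pixels + horizontal blanking
V_TOTAL = 1125   # 1080 active lines + vertical blanking

for refresh_hz in (30, 60):
    pixel_clock_mhz = H_TOTAL * V_TOTAL * refresh_hz / 1e6
    print(f"1080p @ {refresh_hz} Hz -> ~{pixel_clock_mhz:.1f} MHz pixel clock")
```

That works out to roughly 74 MHz at 30 Hz and 149 MHz at 60 Hz, both well under those RAMDAC ratings, so the analog link itself isn’t the limiting factor at 1080p.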
VGA is analog. You ever look at an analog-connected display next to an identical one that’s connected with HDMI/DP/DVI? Also, a majority of modern systems are running at around 2-4 * 1080p, and that’s hardly unnecessary for someone who spends 8+ hours in front of one or more monitors.
I look at my laptop’s internal display side-by-side with an external VGA monitor at my desk nearly every day. Not exactly a one-to-one comparison, but I wouldn’t say one is noticeably worse than the other. I also used to be under the impression that lack of error correction degrades the image quality, but in reality it just doesn’t seem to be perceptible, at least over short cables with no strong sources of interference.
I think you are talking about some very different use cases than most people have.
Really, what “normal people” use cases are there for a resolution higher than 1080p? It’s perfectly fine for writing code, editing documents, watching movies, etc. If you are able to discern the pixels, it just means you’re sitting too close to your monitor and hurting your eyes. Any higher than 1080p and, at best, you don’t notice the difference; at worst, you have to use hacks like UI scaling or a non-native resolution to get UI elements to display at a reasonable size.
Sharper text for reading more comfortably and viewing photos at nearly full resolution. You don’t have to discern individual pixels to benefit from either of these. And small UI elements like thumbnails can actually show some detail.
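To put a number on the sharpness point, here’s a quick sketch of pixel densities, assuming a common 24-inch panel size (an illustrative figure, not anyone’s specific monitor):

```python
import math

# Rough pixel-density comparison at an assumed 24-inch diagonal.
DIAGONAL_INCHES = 24

for name, w, h in (("1080p", 1920, 1080), ("1440p", 2560, 1440), ("4K", 3840, 2160)):
    ppi = math.hypot(w, h) / DIAGONAL_INCHES
    print(f"{name}: ~{ppi:.0f} PPI on a {DIAGONAL_INCHES}-inch panel")
```

Roughly 92 PPI for 1080p versus about 184 PPI for 4K at that size; text rendering is where the difference tends to be most noticeable, even without consciously picking out individual pixels.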
You had 30Hz when I read your comment, which is why I said what I said. Still, there’s a lot of benefit to having a higher refresh rate, as far as user comfort goes.
Okay, fair point, sorry for ninja-editing that.