I ain’t no programmer, but I was a toolmaker and ME who designed machines to be used in factories. I learned not to be surprised at how operators could find new and interesting (sometimes dangerous) ways to run the machines I designed and built. They did things with them I never would have dreamed possible or intended.
This triggers me to my very core.
I have one you should love. And by that I mean hate.
Over a decade ago I was installing some equipment I designed, training the operators, etc. There were electrical and software components to the system, and it was used to test products coming out of final assembly.
The very first thing that happened was the operator taking the stapled-together stack of detailed instructions I gave them, dropping it on the work bench, and using it as a mouse pad to start aimlessly clicking around.
actually, i would like to counter this. Developers often put together shitty UIs that are hard to navigate (mostly because UI design is bad and we’ve been living with floating WMs for the past 30 years, so nobody knows any fucking better for some godforsaken reason)
But it’s no fault of the user for using a shitty interface if it was designed to be used in that manner by the person who built it. This is why so many people like the CLI: it’s impossible to fuck up. You can use it wrong as a user, but that’s because it has a specific syntax. It’s designed to be used in that one manner only, whereas most graphical applications are designed to be “generally applicable” for some reason, and then when a user uses one in a “generally applicable” manner, somehow that’s now the wrong way to use it?
I’d argue floating wms are more intuitive and some can still tile pretty well if you want that
floating WMs are intuitive, but the problem is that they’re an incredibly mediocre solution, and the way that problems are often solved around one, is just entirely asinine. Let’s build ten different ways to do the same thing, now we have 10x the code to build and maintain, and it’s 10x more confusing to the end user who probably won’t know about half of them, because 90% of our documentation is redundant!
Tiling WMs have significantly fewer issues with this, because they often have a very strict set of management rules, and only those. Nothing more.
People screw up CLIs all the time (looking at you, Google Cloud). They (used to) insist on using my installed Python, which automatically upgrades and breaks the CLI. Good job, Python. Good job, gcloud.
i’m not sure that’s a CLI problem, sounds more like an application problem from what i’m hearing.
Exactly! All applications can be shit, not just web sites.
Does anyone have the template for this meme?
Damn, every day it gets harder to find images you can embed… Tried more than 10 before; half are gone, the other half redirect to the site/post… Anyway, here’s one without the text:
Thanks a lot!
I do QA for a living. If that’s the end result, it wasn’t intuitive. 😅
I agree to a point, but users also do some weird stuff that you just can’t predict sometimes.
and this is an incredibly valuable reason to have a technically simple UI, because it fundamentally limits the amount of stupid shit people can do, without it being the fault of the designer.
And some of that is because some users have been trained on some other bad UX.
And that’s precisely why QA still exists and why it shouldn’t be the devs. And yet, you’ll still wind up with weird situations, despite your best efforts!
Yeah.
Any good software developer is going to account for and even test all the weird situations they can think of … and not the ones they can’t think of, since they’re not even aware those are possible (if they were, they would account for and test them).
Which is why you want somebody with a different mindset to independently come up with their own situations.
It’s not a value judgment on the quality of the developer, it’s just accounting for, at a software development process level, the fact that humans are not all knowing, not even devs ;)
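One way teams try to mechanize “inputs nobody thought of” is randomized (fuzz-style) testing: generate inputs instead of hand-picking them, and check properties that should hold for any input. A minimal stdlib-only sketch — the `normalize_spaces` function here is invented purely for illustration, not from any real project:

```python
import random
import string

def normalize_spaces(s: str) -> str:
    # Hypothetical function under test: collapse runs of whitespace to one space.
    return " ".join(s.split())

random.seed(0)
for _ in range(1000):
    # Generate inputs a human tester probably wouldn't type by hand.
    s = "".join(random.choice(string.printable) for _ in range(random.randint(0, 40)))
    out = normalize_spaces(s)
    # Properties that must hold for *any* input, not just hand-picked cases.
    assert "  " not in out, f"double space survived: {s!r}"
    assert out == out.strip(), f"leading/trailing space survived: {s!r}"
```

It won’t find everything a fresh pair of eyes would, but it does surface cases the author never consciously imagined.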
A) People still get paid to do dedicated QA?
B) If you really think that, you must be a noob.
QA is also known as preventing shit from exploding and losing us millions of dollars in the process, or better yet, cybersec. Cybersec is just glorified QA
I guess I’m just being a snob here.
I worked for an actual QA department that produced actual documentation and ran actual full scale QA cycles.
In the past 15 years, I have seen that practice all but fully disappear and be replaced by people who click at things until they find 1 thing, have a verbal meeting vaguely describing it, and repeat 2 to 3 times a day.
IMO, that isn’t QA. It’s being lazy, illiterate, and whiny while making the dev do ALL of the actual work.
A lot of QA has probably been automated. The entirety of SQLite, for instance, is verified by an automated testing suite to ensure functionality.
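For a sense of what automated QA looks like in the small: a regression suite encodes the tester’s checks as code so they re-run on every change. A minimal sketch using Python’s stdlib `unittest` — the `parse_price` function is made up for illustration:

```python
import unittest

def parse_price(text: str) -> float:
    # Hypothetical function under test: turn "$1,234.50" into 1234.50.
    return float(text.replace("$", "").replace(",", ""))

class ParsePriceTests(unittest.TestCase):
    def test_plain_number(self):
        self.assertEqual(parse_price("19.99"), 19.99)

    def test_currency_and_thousands(self):
        self.assertEqual(parse_price("$1,234.50"), 1234.50)

    def test_garbage_rejected(self):
        # Regression check: bad input should fail loudly, not pass through silently.
        with self.assertRaises(ValueError):
            parse_price("not a price")
```

Run with `python -m unittest` in the project directory; once a bug is found manually, it gets pinned down as a test case like these so it can never silently come back.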
That’s a fair point.
When I departed QA myself, it was in the onset of automation.
In return, when the QA jobs disappeared, I learned basic scripting and started automating BI processes.
So, I would say:

1. I should hope modern QA departments (as I am told they exist) are automated and share both their tests and their results with devs in an efficient manner.

2. I don’t think QA departments really exist today in a substantive way, and if they do, it isn’t in as cooperative a fashion as described in 1.
I still have observed a world where QA went bye bye. Planning? Drafting a Scope of Work? Doing a proper analysis of the solution you are seeking, fleshing it out, and setting a comprehensive list of firm requirements that define delivery of said solution? Offering the resources to test the deliverable against the well documented and established requirements to give the all clear before the solution is delivered?
Doesn’t exist anymore, and modern “QA” is being the lemming who sits in meetings and listens to management, then schedules meetings to sit and complain at the dev about how they aren’t “hitting the mark” (because it was about 4 feet directly in front of them when they published, and is now at 5 erratically placed spaces behind them).
I think it’s probably because we’ve shifted away from shipping software as a product and onto software as a service. I.e., in the 90s, if Win 95 irreversibly corrupted itself, that would have been devastating to sales.
But today with Windows 11? Just roll it out in one of the twenty-three testing branches you have and see what happens, and if shit does break, just work around it. It’ll be fine. Even if something does happen, you can, most of the time, fix it and roll out a new update.
And I also think it’s moved to be more team-centric, rather than department-centric. A lot of the theory is probably more of a senior-team-led responsibility, while everyone writing the code can chip in and add some as well. Developers knowing how to write secure code helps, so they should theoretically also be capable of QA themselves to a degree.
Also there’s a lot more money in shipping shit out the door, than there is in shipping a functional product, unfortunately.
Thank you for your TED talk defining enshittification.
Middle management bloat.
Edit: Bonus points for
Developers knowing how to write secure code helps, so they should theoretically also be capable of QA themselves to a degree.
Which is straight up just saying “why don’t the devs just do it themselves? I’m busy with meetings to whine back and forth with other middle management.”
-
A) Yes. Large companies have entire departments dedicated to QA, and it’s best not to leave QA to devs, if you can afford it. Dunno what you mean by “still,” since the job never went away.
B) Okay?
Yes grasshopper.
I was a QA for over 15 years.
Then the “Agile” fad ripped through the industry and QA died.
In Agile, QA testing should be involved throughout the whole development process, with QA not just following the development, but supporting it. QA testing should be implemented early and continuously, with constant feedback to developers to ensure that any issues are fixed quickly.
Hmmm…
Which, when put into practice, means QAs become BAs, no comprehensive QA occurs, and when the code is shit (because they have no actual QA support and the scope changes constantly with no firm documented requirements), the dev gets fired.
Great model for people who like to sit in meetings and complain.
Horrible model for the people who actually work.
Agile made Management, who had actual Senior Designer-Developers and Technical Architects designing and adjusting actual development processes, think that they had a silver-bullet software development recipe that worked for everything, so they didn’t need those more senior people (read: more expensive, and unwilling to accept the same level of exploitation as the more junior types) anymore.
It also drove the part of the Tech Industry that relies mainly on young and inexperienced techies and management (*cough* Startups *cough*) to think they didn’t need experienced techies.
As usual, it turned out that “there are no silver bullets” and things are more complex: Agile doesn’t work well for everything, and various individual practices of it only make sense in some cases (in some they’re even required for the rest to work properly) whilst in others they’re a massive waste of time (and in some cases, the useful-wasteful balance depends on frequency and timing). Plus, in some situations (outsourced development) they’re extremely hard or even impossible to pull off at a project scope.
That said, I bet that what you think is “The Industry” is mainly Tech companies in the US, rather than where most software development occurs: large non-Tech companies with a high dependency on software for competitive advantage (such as banks) and hence more than enough specific software requirements to hire vast software development departments to develop custom solutions in-house for their specific needs.
Big companies whose success depends on their core business-side employees doing their work properly care a lot more about software not breaking or delaying their business processes, and hence hire QA to find those problems in new software before it even gets to the business users. Tech companies providing software to non-paying retail users, who aren’t even their customers (the customers are the advertisers they sell access to those users’ eyeballs to), will shovel just about anything out and hopefully sort out the bugs and lousy UX/UI design through A/B testing and user bug reports.
You either never worked with anything that did actual agile (to be fair, most don’t) or you haven’t done development in a long time if you think that.
A devout Kool aid drinker I see.
Did you buy that Kool aid with your story points?
I hear they have a competitive exchange rate to Stanley nickels.
Don’t take this badly but it sounds like you’ve only seen a tiny slice of the software development done out there and had some really bad experiences with Agile in it.
It’s perfectly understandable: there are probably more bad uses of Agile out there than good ones and certain areas of software development tend to be dominated by environments which are big bloody “amateur hour every hour of the day, every day of the year” messes, Agile or no Agile.
That does not, however, mean that your experience stands for the entirety of what’s out there, trumping even the experience of other people who also work in QA in environments where Agile is used.
When did you retire? Agile has been around for at least 20 years (more like 30 if you count Scrum being introduced before Agile was formally defined). No matter how critical I am of Agile, it is hardly a fad at this point.
Agile was definitely taken up with the same irrationality as fashion at some point.
It’s probably the best software development process philosophy for certain environments (for example, where there are fast-changing requirements and easy access to end users) whilst being pretty shit for others (good luck trying to fit it in at a process level when some software development is outsourced to independent teams, or using it for high-performance systems design). It eventually mostly came out of that fad period being used more for the right things (even if, often, less than properly) and less for the wrong things.
That said the Agile as fad phase was over a decade ago.
Still working.
I stopped being able to find QA work in the early 2010s or so. Converted to BI Developer. Have not encountered a dedicated QA at any of the small assortment of jobs I have had since.
Edit: And fair, despite it being a waste-of-time cult mentality engineered to make developers suffer and enshittify software quality, Agile got enough Kool-Aid drinkers to qualify it as more than a fad.
I work at a company whose entire business model is providing QA to other companies. I work directly with some very large, public companies, and some smaller ones. Almost all of them have some form of dedicated in-house QA, which we supplement.
Dunno what to tell you. I do QA for a living. I see postings all the time for QA positions in other companies, and my company has had QA for at least two decades, with the department expanding over the last three years.
I’m not claiming it’s ubiquitous, but maybe you’re just out of the loop.
Bruh I’m a dev who does Agile and we still have a QA department lol
“The only intuitive interface is the nipple. After that, it’s all learned.” — traditional 20th-century folk wisdom.
Some babies have to be taught to nurse…
Bottles have nipples
Can you milk a bottle, Greg?
If their issue is with latching, then a bottle’s not gonna change that.
I’m not a professional baby feeder I just know that when my son wouldn’t latch on a tit they gave us a bottle and he did just fine.
I’m pretty sure that won’t stand in the way of somebody inventing a square bottle nipple and blaming the users for not using it properly.
If they tried opening the door the wrong way, the door is wrong.
Maybe you need better signage. Maybe you need to reverse the direction of the door. Maybe you could automate the door. Or maybe the user is just fucking stupid. 😄
The philosophy is that the user’s intuition is never wrong because that’s what we’re trying to accommodate.
Also, if you have to post a sign, it’s probably broken by design. Users don’t read.
This is very perfectionist. Let me install my doors the way that’s comfortable or pleasing. Where I see a knob, I’ll reach. And where I see a “pull” sign, I pull, or get context clues.
There is research for everything. Let’s say it’s more comfortable to push and the knob is on the right side for me. I could spend way more time and effort than this deserves to appeal to that study. “I have great UX,” I’d tell myself. But then I’d ship this product to some eastern market where they read in “reverse”, and it’ll be neither comfortable nor “100% natural” for them. Meaning I’d fail; my UX would be horrible for half the planet.
This might be worth it for universal things that are already researched, where you don’t need to spend years and a kidney to figure it out. Like maybe how “next”, “cancel” and “back” buttons are placed next to each other. But I mean… just copy the most recent one you used.
You might have noticed at some point that door knobs are universally at the same height, and the same goes for light switches in houses that don’t suck.
there’s a difference between trying to open a door from the hinged side, vs designing a door that has 14 different deadbolts, and three latches on it.
One of those is user error, the other is designed complexity generally being a hindrance to the user.
To be fair all “users” got what they wanted so… Success?
“Ugh, it works, but it was overly complicated to get what I needed.”
yeah who the fuck made this meme? A web programmer?
Ah yes, the cable kitties. First the orange one approached the food from the front, and all was well and simple if a little diagonal. Then the white one approached from the left. Now it could have gone around and kept things tidy, but that’s not how cable kitties work. It walked right over the orange cable kitty’s head and started eating. Then when the black cable kitty came from the right, there was only one food socket left. Now this cable kitty could have gone around, but cable kitties always take the shortest path. Up and over the black cable kitty went, and thus the tangle of cable kitties was complete.
I’ve actually worked with a genuine UX/UI designer (not a mere Graphics Designer but their version of a Senior Developer-Designer/Technical-Architect).
Let’s just say most developers aren’t at all good at user interface design.
I would even go as far as saying most Graphics Designers aren’t all that good at user interface design.
Certainly that explains a lot of the shit user interface design out there, same as the “quality” of most common frameworks and libraries (such as those from the likes of Google) can be explained by them not actually having people with real-world Technical Architect-level or even Senior Designer-Developer experience overseeing the design of frameworks and libraries for 3rd-party use.
I’m a developer and I should not be allowed to wing it with UI/UX design.
Yes you should. I think most comments here are about products that have millions of users where it’s actually worthwhile spending all that extra time and money to perfect things.
For most development, it isn’t worthwhile and the best approach is to wing it, then return later to iterate, if need be.
The same goes for most craftsmanship, carpentry in particular. A great carpenter knows that no one will see the details inside the walls or up in the attic. Only spend the extra time where it actually matters.
It triggers me immensely when people say “I could have done a better job than that” about construction work. Sure, maybe with twice the budget and thrice the time.
Exactly. I’d also like to add: look at Google’s stuff; their UI/UX is routinely horseshit. So don’t tell me there are UI/UX gurus out there GIGAchading user interfaces.
A lot of this shit is trial and error and even then they still fuck it up.
Make it accessible, make it legible and then fine tune it after.
User Research: Exists
Devs:
User Research: Exists
Me: hhhhh