jasonhong
a month ago
One big area of psychology not mentioned in the article that has been seeing a good amount of success is applied psychology with respect to Human-Computer Interaction.
For example, there's a lot of basic perceptual psychology regarding response times and color built into many GUI toolkits in the form of GUI widgets (buttons, scrollbars, checkboxes, etc). Change blindness (https://en.wikipedia.org/wiki/Change_blindness) is also a known problem for error messages and can be easily avoided with good design. There's a lot of perceptual psychology research in AR and VR too.
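To make the change-blindness point concrete, here is a minimal sketch of one common fix (the function and CSS class names are hypothetical, not from any particular toolkit): users miss an error message that is silently swapped in, so briefly animating the region whenever its text actually changes gives the eye a motion cue.

    // Hedged sketch: counter change blindness by restarting a highlight
    // animation whenever the error text changes. Assumes a CSS class
    // "error-flash" that defines a short flash/fade animation.
    function showError(el: HTMLElement, message: string): void {
      if (el.textContent === message) return; // nothing changed, stay quiet
      el.textContent = message;
      el.classList.remove("error-flash");     // reset any previous animation
      void el.offsetWidth;                    // force a reflow so it can restart
      el.classList.add("error-flash");        // the motion draws the eye
    }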
With respect to cognitive psychology, there's extensive work in information foraging (https://en.wikipedia.org/wiki/Information_foraging), which has been distilled down into heuristics for information scent.
With respect to social psychology, there are hundreds of scientific papers about collective intelligence, how to make teams online more effective, how to socialize newcomers to online sites, how to motivate people to contribute more content and higher quality content, how and why people collaborate on Wikipedia and tools for making them more effective, and many, many more.
In past work, my colleagues and I also looked at understanding why people fall for phishing scams, and applying influence tactics to improve people's willingness to adopt better cybersecurity practices.
Basically, the author's argument holds if you take a very narrow view of psychology, but there's a lot of really good work on applied (and practical!) psychology going on outside of traditional psychology journals.
mnw21cam
a month ago
As a counter-argument, HCI was investigated pretty thoroughly in the 80s and 90s, and operating systems of the time actually implemented the results of that research well. I feel that modern OS developers seem determined to throw away all those lessons.
Don't get me wrong, I think the modern HCI on mobile phones is remarkably good. But I haven't seen any improvement (except maybe the mouse scroll wheel and having a higher resolution screen) on real computer interfaces since the 90s.
And then you have some really useful psychological theories on attention and user-guiding that are used for evil to create dark patterns. I don't think we're making progress.
SkyBelow
a month ago
I think we should be careful to distinguish the question of whether we are growing knowledge from the question of whether we are using that knowledge (and whether we are using it positively). If we aren't using it, there is an interesting question of why, but I think there should be a clear difference between not finding knowledge and not utilizing the knowledge we find.
AStonesThrow
a month ago
I've a theory that most UX/UI developers started in their youth as gamers, especially in "twitch" genres, because many interactions for me are now closer to playing Descent than typing a paper into WordPerfect.
gspencley
a month ago
> Don't get me wrong, I think the modern HCI on mobile phones is remarkably good.
One of the challenges of psychology is individual variation. Humans have more in common with one another than we have differences, but individuality is a major factor that forces psychologists to look at things statistically unless they are specifically trying to understand or control for individual variance.
I bring this up because my personal subjective opinion is that HCI on modern mobile phones is absolutely atrocious and I don't use a smart phone as much as most people as a result.
I think that when it comes to interacting with a tool, what you are accustomed to makes a world of difference. I grew up with desktop computers and laptops. With keyboards, in other words. As a coder and a *nix "power user", I like command line interfaces. I like being able to tweak and customize and configure things to my liking. When I have to use MacBooks at work, it has been soul-crushing for me, while others absolutely love the macOS UI.
I also remember the shift of the mobile revolution. A lot of us at the time were starting to get very annoyed by the creep of mobile design conventions making their way into non-mobile contexts. At the time it was understood that those mobile design decisions were "forced" as a result of the limitations of a mobile device, and it was clear that applying them to non-mobile contexts was a cost-cutting measure (mobile first, in other words).
Although well-designed iconography can transcend language barriers and facilitate communication, I find that the limited resolution of a smart phone screen forces designers to use glyphs instead of written text, which is very confusing to me. I mean, don't get me wrong, I would love to learn ancient Egyptian, but it is often far from intuitive or obvious what these hieroglyphs on the screen are meant to communicate to me. In other words, the iconography is not well designed IMO. At least not in a way that creates an intuitive experience FOR ME.
But a kid who grew up in a world of smart phones is going to be able to navigate them intuitively, because they have years of learning what those esoteric glyphs on the touch screen mean. They've had years of "typing" out text messages on tiny touch screens.
On a good mechanical keyboard I can type upwards of 117 wpm before I start making mistakes. When trying to text my wife one sentence, I need to set aside an afternoon to get it written correctly. I could get started on how awful auto-correct is, but everyone knows this to the point where it's become a cultural meme. Sorry, auto-correct turned "Can you grab me some milk while you're there?" into "fyi the police are here with a search warrant."
So yeah, big tangent off of "HCI on mobile phones is remarkably good." Maybe it is in a relative sense and is as good as it can get... I mean, we've had years to iterate and make improvements. But I suspect that a lot of it has to do with people just learning and getting used to haphazard design decisions that became the de facto standard for mobile, because the tech industry (and business at large, if we're being honest) loves to copy.
mnw21cam
a month ago
I also was raised on using a keyboard to interact with a computer. I agree with a lot of your points - the UI on a mobile phone is not very good at doing text-based stuff, but I think that's OK, because I shouldn't be trying to do large-scale text-based stuff on such a tiny screen with a tiny input area.
What works well on the phone UI is the way the touchscreen has been integrated into it, and the various gestures are mostly quite intuitive as to what they do (although if we could stop maps rotating when we try to zoom in/out with the pinch gesture, that'd be lovely, thanks).
The problem comes when trying to apply the mobile phone style UI to a real computer with a keyboard, large screen, etc. That's just awful, but it appears to be the route that UI designers are galloping down these days.
lukev
a month ago
Interesting. I agree with you about being most comfortable with a desktop/keyboard interface, but have the exact opposite opinion regarding macs.
IMO, OSX is the perfect platform for a keyboard-driven power user. It's Unix/BSD-based, so software mostly works the way you want it to, but unlike Linux it "just works" without endless fiddling. I don't use the OS UI much at all: Spotlight lets me open any app with a few keystrokes. All my time is spent in the terminal or the browser.
ndriscoll
a month ago
I've done almost no fiddling on NixOS in the last 7 years. People fiddle on Linux because they like to fiddle. My experience is it absolutely Just Works. By contrast, I've had OSX at work delete my data after one update and corrupt its install after another.
katzenversteher
a month ago
This is especially important in industrial settings. If a machine operator makes a mistake, it's not just expensive, it can cost lives. There were instances where operators actively fed fuel into fires because they misunderstood the situation displayed on the HMI (human-machine interface).
Some time ago I found a really nice presentation about the ISA 101 standard covering this topic. The basic idea is: the HMI looks boring when everything is okay; if something heads in a dangerous direction, colors and other elements are used to draw your attention.
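The rough shape of that "gray by default" idea, as a minimal sketch (the thresholds, colors, and function names here are hypothetical illustrations, not values from the standard):

    // Render a process value in muted gray while it is in its normal band,
    // and escalate to salient colors only as it drifts toward danger.
    type Style = { fill: string; blink: boolean };

    const NORMAL: Style = { fill: "#d0d0d0", blink: false }; // boring = okay

    function styleFor(value: number, low: number, high: number,
                      criticalMargin: number): Style {
      if (value >= low && value <= high) return NORMAL;
      const overshoot = Math.max(low - value, value - high);
      return overshoot > criticalMargin
        ? { fill: "#ff0000", blink: true }   // critical: red and blinking
        : { fill: "#ffd700", blink: false }; // warning: yellow, steady
    }

    // e.g. a pressure reading drifting above its 0-100 operating band:
    console.log(styleFor(112, 0, 100, 10)); // -> the critical style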
yowzadave
a month ago
> there are hundreds of scientific papers about collective intelligence, how to make teams online more effective, how to socialize newcomers to online sites,
I'm curious what research there is about how to create better-socialized groups of people in general; obviously some cultures are more successful in certain areas than others, despite starting with basically the same human genetics--is there any evidence that a culture can learn/adapt in intentional pro-social ways? How does a society learn to be less corrupt over time? How do people decide to stop littering/speeding/parking illegally? How does a society develop a respect for their environment, for their neighbors, for future generations, etc.?
jasonhong
a month ago
This is a really great question, and well beyond my areas of expertise. What I can point you to is this excellent book by my colleague Bob Kraut and several of his colleagues, entitled Building Successful Online Communities: Evidence-Based Social Design. It summarizes a lot of empirical research into design claims: how to socialize newcomers, increase contributions, improve the quality of contributions, and more.
https://direct.mit.edu/books/monograph/2912/Building-Success...
You might also look into research on pro-social behaviors. https://en.wikipedia.org/wiki/Prosocial_behavior
One of my favorite books that I learned about from my colleagues is Influence by Robert Cialdini. It looks at how to use known social influence tactics to change people's behaviors. Ideally, these would be used for things that society widely regards as positive (e.g. less littering), though these have also been used for phishing attacks and other dark patterns.
scotty79
a month ago
It's really telling how it's much easier to progress when things that you are working on are directly measurable rather than self-reported or estimated through proxies.
Also, progress in any science is contingent on progress in technology. There's only so much you can figure out before you need new, more precise ways of measuring things to get any further.
SimpleMinds
a month ago
> perceptual psychology research in AR and VR too
That sounds interesting. Would you mind sharing where you would point me if I wanted to follow the latest research?
Something like arXiv but for psychology? Unless it's only in the magazines ("Psychology Today")? I'd be happy to hear the magazine names too, if you'd be so kind as to share.
Thank you very much!
jasonhong
a month ago
I'm not an expert in AR and VR, but I can point you to papers by two of my colleagues who know a lot about this space.
David Lindlbauer is a faculty member at CMU who applies a lot of perceptual psych to his research on VR. https://scholar.google.com/scholar?hl=en&as_sdt=0%2C39&q=dav...
Roberta Klatzky is a perceptual psychologist who has done a lot of work on haptics. One of her ongoing projects is augmented cognition through wearables, e.g. giving people instructions in heads-up displays based on the current state of things (e.g. it looks like you successfully removed the lug nuts, here's your next step in changing the car tire). https://scholar.google.com/scholar?hl=en&as_sdt=0%2C39&q=rob...
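As a rough illustration of what that kind of state-driven guidance might look like (the task states, strings, and function names here are made up for the example, not taken from her project):

    // A perception module (vision, sensors, ...) would supply the current
    // task state; the heads-up display just shows the matching next step.
    type TaskState = "wheel_on_ground" | "car_jacked" | "lug_nuts_removed" | "done";

    const NEXT_STEP: Record<TaskState, string> = {
      wheel_on_ground: "Jack up the car until the wheel clears the ground.",
      car_jacked: "Loosen and remove the lug nuts.",
      lug_nuts_removed: "Pull the wheel off and mount the spare.",
      done: "All steps complete.",
    };

    function hudInstruction(state: TaskState): string {
      return NEXT_STEP[state];
    }

    console.log(hudInstruction("lug_nuts_removed")); // step after the lug nuts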
SimpleMinds
a month ago
Thank you! Checking!
hinkley
a month ago
In the broader category of cognition, I think we understand a bit better how people rationalize their decisions. How many things we do almost entirely on pure reflex and then manufacture a story that explains it without sounding crazy or just saying “I don’t know.”
raverbashing
a month ago
My suspicion is that in these areas of "nuggets of knowledge", psychology studies kinda work well and can be applied piecewise.
But I feel that anyone who thinks psychology will be fully predictable, or even up to the standards of medicine today, is in for a disappointment.
(but oh well, they can still run their experiments on grad students or Amazon MTurk workers and get another grant)
silvestrov
a month ago
And medicine is like "we don't know why this medicine works, just that it mostly does", so it's not much of a standard.