
I see your point. Yes, all those types of software are complex. And maybe that's just the way it has to be. But take the non-software versions. Sculpting a bust out of clay is certainly a difficult skill, but it doesn't take 3000+ hierarchical options. All it takes is a pile of clay and two hands. Maybe add 1 to 5 tools. Similarly, building lots of wood furniture is a skill, but it doesn't take 3000+ hierarchical options. It takes just a few tools. Drawing a diagram on paper (Illustrator) takes a pencil, a ruler, maybe a compass.

I just feel like there's some version of these tools that could be 100x simpler. Maybe it will take a holodeck to get there, and AI reading my mind so that it knows what I want to do without me having to dig through 7 levels of menus, sub-sections, etc....



I think this point trends into HCI (e.g., https://worrydream.com/ABriefRantOnTheFutureOfInteractionDes...), but I think the overarching phenomenon you're describing is that custom interfaces are usually superior to adaptations to the mouse/keyboard.

There are many areas where custom hardware interfaces are popularly used in conjunction with software:

- MIDI controllers (note this covers the piano keys, the knobs/sliders, and even foot pedals)

- Jog wheels for NLEs

- Time-coded vinyl https://en.wikipedia.org/wiki/Vinyl_emulation

- WACOM Tablets

All of these custom hardware interfaces accomplish the same thing: They make using the software more tactile and less abstract. Meaning you replace abstract-symbol lookup (e.g., remembering a shortcut or menu item) with muscle memory (e.g., playing a chord on a piano).
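To make that tactile-to-parameter mapping concrete, here's a minimal sketch of how a knob on a MIDI controller might drive a software parameter like exposure. The function name, controller number, and exposure range are my own illustrative choices, not from any particular product; only the MIDI control-change message layout (status byte 0xB0-0xBF, then controller number, then a 7-bit value) comes from the MIDI spec:

```python
def cc_to_exposure(message: bytes, controller: int = 7,
                   lo: float = -2.0, hi: float = 2.0):
    """Map a 3-byte MIDI control-change message to an exposure value
    in [lo, hi]. Returns None for messages we don't care about."""
    status, ctrl, value = message
    if status & 0xF0 != 0xB0:      # 0xB0..0xBF = control change
        return None
    if ctrl != controller:         # not the knob we're watching
        return None
    return lo + (value / 127) * (hi - lo)  # 7-bit value -> float range
```

So turning the physical knob from one end to the other sweeps the exposure smoothly through its range, with no menu lookup in between; e.g., `cc_to_exposure(bytes([0xB0, 7, 127]))` gives the maximum exposure of 2.0.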

So TLDR, the reason that we don't have what you're looking for is that we don't have a good way to simulate clay and wood as hardware that interfaces nicely with software.

Note there's a larger point here, which I think is more what you were getting at. People sometimes expect (and I expected this when I was younger) that computers could invent new, better interfaces to tasks (e.g., freed from the confinements of physics). Now I think it's totally the opposite: the better interfaces usually come from the physical world (which makes sense if you think about it; often what we're talking about are things that human beings have refined over thousands of years), and enforcing the laws of physics usually actually makes things easier (e.g., we've been dealing with them since the moment we were born, so we have a lot of practice).

Finally, also note that custom hardware interfaces only tend to help along one axis (e.g., a MIDI controller only helps enter notes/control data). The software still ends up being complex because people also want all the things computers are good at that real-world materials aren't, like redo/undo, combining back together things that have been broken apart, zooming in/out, seeing the same thing from several perspectives at once, etc...

PS: I don't even know if the holodeck or mind-link would really help here. It's possible, but it's also possible it's just difficult for our brains to describe what we want in a lot of cases. E.g., take just adjusting the exposure: you can turn it down, oh but wait, I lost the violet highlight that I liked; how can I light this scene and keep that highlight and make it look natural? I don't know, maybe this stuff does map to a holodeck/mind-link, but it's also possible that having tons of options for every light really is the best solution to that.



