Hacker News

Maybe, but I think that illustrates an important point: He was no expert in how algorithms operate - but neither are the majority of people who are affected by algorithms.

I think the warning the article wants to convey is that current trends could - eventually - lead to a society in which personal freedom and even laws are undermined by completely opaque algorithms. Opaque both because companies are allowed to keep them under wraps (and will absolutely do so) and because, even if they were made accessible, only a small group of experts would be able to understand them - and sometimes, as the article shows, not even then. I think that is a valid concern.

The last point - that understandability is such a low priority in development that frequently even the designers themselves don't understand exactly what is happening - is a different problem, but maybe less important than the first one.



It may be a matter of trust and practice.

To overcome opacity, one may become an expert oneself, or trust experts who assert that they have sufficiently analysed the thing in question (here, an algorithm) and are confident in it.

In addition, over time, one sees that the thing does an apparently adequate job and doesn't produce many obvious problems. Such things are at first "feared" and rarely used, then gradually gain adoption (benefiting from positive feedback).

No one can fully understand the whole technical and organisational context of jetliners, but most of us use them because apparently competent and objective experts continuously check all this, and because problems appear to be rare.

This approach may/will fail in catastrophic ways for various causes, either psychological (one loses too much autonomy and incentive) or systemic (the composition of various components may produce an inadequate system).



