If we can ever get to the stage where I can just think and it gets saved as a text file, that would be more exciting to me than "metaverse" and quantum computing together.
The industrial revolution ended up destroying most of our biosphere. It started an extinction event, a rapid and irreversible climate change making our planet hostile to life.
And yet people are better off than ever, with more people having been raised out of abject poverty than anyone could have imagined at the onset of the industrial revolution.
Yeah, there’s an extinction event, but the industrial revolution isn’t entirely to blame: people have been driving animals to extinction for thousands of years, starting at least with the megafauna extinctions.
It’s sad, but organisms have been getting outcompeted, and reshaping the atmosphere and climate, for billions of years. We can pretend humans are some anomaly and not part of nature, but that isn’t the case. This extinction event will give rise to new organisms that fill new niches, just like every other one has.
That isn’t really true; we have alternatives (or at least theories for alternatives) for pretty much everything we need to keep going. The problem is political, but over a long enough period the politics will likely be irrelevant.
Okay sure, but the industrial revolution has been good for pretty much every nation on the planet. Since 1990 alone, 1.2 billion people have been lifted out of abject poverty[1]. Since 1900 the number of people not living in abject poverty has increased from a few hundred million to over 6 billion[2].
So you can quip about me being American, I am, but that doesn’t change that the industrial revolution has brought untold wealth and economic, if not political, freedoms to a majority of people on the planet.
It certainly has the potential to result in a dystopia, but it could also become a utopia.
Today, in the USA, corporations are incredibly powerful, surveillance technology is growing faster than legal frameworks or consumers can keep up with, and there's little expectation of coordinated resistance from uninformed, irrational, impotent consumers, or of effective regulation from our partisan and captured government agencies. That hasn't always been true - at other times, colonial governments, monarchies, feudal leaders, tribal leaders, or religious leaders have held power. It probably won't be true in perpetuity.
The trick is to make sure that we only open Pandora's Box of brain-computer interfaces (or better and also more frighteningly, full-brain upload and emulation) technology when society is ready...
Edit: I'm reminded of qntm's excellent short story "Lena" at https://qntm.org/mmacevedo - about "the earliest executable image of a human brain". I won't spoil it, other than to say that it's something of a horror story, depending on your worldview and the depth of your imagination.
"Lena" is existentially terrifying. Certainly interesting.
> The trick is to make sure that we only open Pandora's Box of brain-computer interfaces (or better and also more frighteningly, full-brain upload and emulation) technology when society is ready...
I don't think this will be the case. We unleashed social media on portable devices immediately. That's a strong suggestion that there will be no pauses to think about it.
It could, of course, become anything in between - slightly better, slightly worse, no change, anywhere on the spectrum.
But I, like a great many people both on this website and worldwide, spend most of my working life entering data into a computer (and reading it out of a monitor), and derive a great deal of utility from data entered into a computer, so it's reasonable to assume that it will result in significant changes.
you're right, and the link you shared was moving... personally I think things will move much faster than we expect, and much sooner than in the mmacevedo story
Stalin's trains ran on time. It was still an authoritarian dystopia. The thing about dystopian qualities is that they tend to nullify the meaning of any potentially otherwise good things.
It was not a dystopia. It had dystopian qualities, but millions of people's lives were better under Stalin than they had been under the Imperators. That's the thing: even the worst nations we know of are not dystopias, because a dystopia is the opposite of a utopia, and neither is attainable in the real world.
Interesting project, though a lot of prior open source work exists for very similar devices[1]. Couldn't get access to the research paper to read through due to a paywall, so hopefully the Github gets updated with more information.
I came here to ask how it was different from OpenBCI.
It might end up slightly cheaper. They use the ADS1299 Analog-to-Digital Converter, which is $40 (ADCs are usually one of the largest costs of BCIs). So it seems like the all-in cost won't be much cheaper than OpenBCI.
Hopefully I’m wrong and someone will correct me because I’d love to see a research-grade BCI for ~$100-$200 (not sure about Muse 2 but my neuroscientist friend called Muse 1 “a fun toy, but still a toy”… sadly that’s the closest I’ve found so far.)
didn't some grad students at mit wrap the intan adc/amplifier with an fpga and open source board design like 10 years ago? i think the component prices came in under $200.
dunno about signal or clock quality though. (i suppose that's a function of the intan package)
ok, i think maybe it was open ephys, and it looks like it's 5k euro for a 64 channel starter system. (with the intan devices being the most expensive, costing on the order of 500 to 1000 per 32 channel package)
still an order of magnitude cheaper than the systems i knew and open source to boot.
As far as I can tell, OpenBCI already is that thing.
It's just very hard to sustain a business on one-off low volume hardware sales so the prices on the official site are relatively high compared to BOM (but perfectly reasonable and necessary to sustain further R&D).
It seems many hobbyists will buy off aliexpress while institutions / researchers tend to get the "official" hardware.
If you're happy with 8 channels wired (run laptop off batteries and use a good USB opto-isolator...) then you can get that right now for about 200 USD (not including headware).
(Note: better to get a 32-bit board not 8-bit one).
>It's just very hard to sustain a business on one-off low volume hardware sales so the prices on the official site are relatively high compared to BOM (but perfectly reasonable and necessary to sustain further R&D).
I run a low-volume open source hardware business[1] and honestly don't buy this argument at all.
So many OSHW businesses have absolutely woeful logistics that treat every single person outside of the US (or occasionally EU) as an afterthought.
This is both what drives up the cost and what pisses people off enough to create a sustainable business model for the boys in Shenzhen.
People aren't going to support your business if you ask insultingly high shipping prices or are constantly out of stock.
That is a neat board and I am having a hard time not impulse buying it(!)
Based on your background I expect you have more insight than me, but I do wonder if they have deliberately abandoned the low end of the market to Shenzhen to avoid cannibalising (much more profitable) institutional sales.
> That is a neat board and I am having a hard time not impulse buying it(!)
Give in to your urges! Only material consumption can bring happiness! :P
>I do wonder if they have deliberately abandoned the low end of the market to Shenzhen to avoid cannibalising (much more profitable) institutional sales.
I wouldn't be super surprised by that. I've been on both sides of the fence there and whether or not the product itself is good value tends to be much less important than whether or not the sales process makes the purchasing department feel important.
From the information that they've published (paywalled article excluded of course) it does not seem like they're going for anything necessarily novel or niche-filling, more just making another option. That's not necessarily a bad thing, as (coming from someone who works in BCI) more tech is always welcome, even if it's in what I and most other people I talk to consider a dead-end technology (EEG).
Also funny you bring up the Muse, as I actually worked there a few years back. The Muse 2 is a reasonable increase in signal fidelity though with a tradeoff in that it has more issues in adverse environmental conditions (sweat etc). The main difference between the generations is in usability on the consumer end, which is the target market for the device. It was never intended to be a research tool, and it excludes many of the important electrode locations for research. For that reason, calling it a toy is roughly accurate in my opinion.
As a graduating senior in CS who is super interested in working in BCI, may I ask how you got into the field? I would love to work somewhere like Neuralink but it seems that most of these companies are not hiring new grads (understandably, perhaps). I might just get this or openbci and do a project on my own, to start.
If you're interested in working with EEG, the advice of the other poster is worthwhile. I've transitioned away from EEG as I don't believe it's a future-proof technology. I got into the field through a series of lucky internships during university, and a whole lot of networking got my resume into Neuralink before the destealthing last year. My advice is to apply everywhere you can, be passionate, and reach out to people. There are more opportunities out there than you think.
Learn EEGLAB in Matlab, using existing online EEG datasets. Don't collect your own; it's not worth it. Join the EEGLAB list. Figure out how to use some machine learning classification algorithms on the datasets. (There are walkthroughs.) Do something cool and share. Then use their devices.
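To make the "classification algorithms on the datasets" step concrete, here's a minimal sketch in Python with scikit-learn (the comment suggests Matlab/EEGLAB, but the classification step looks the same either way). The data below is a synthetic stand-in for real EEG features; the trial count, feature layout (band power per channel), and class labels are all assumptions, not from any particular dataset.

```python
# Hedged sketch: cross-validated classification of per-trial EEG features.
# Synthetic data stands in for features extracted from a real public dataset
# (e.g. band power per channel per trial after preprocessing).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Pretend features: 200 trials x 16 features (e.g. 4 bands x 4 channels).
n_trials, n_features = 200, 16
X = rng.normal(size=(n_trials, n_features))
y = rng.integers(0, 2, size=n_trials)  # two classes, e.g. left/right motor imagery
X[y == 1, 0] += 1.0                    # inject a weak class difference to classify

# Standard first baseline: logistic regression with 5-fold cross-validation.
clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.2f}")
```

With real data, the interesting work is upstream of this snippet - epoching, artifact rejection, and feature extraction - but a simple linear model with cross-validation is the usual sanity check before trying anything fancier.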
In the initial version of the article there was information about an additional board with 8 sensors, but that information had to be deleted (by the editors).