Nov 11

Project Black Mirror
The guys behind Project Black Mirror have recorded brain-wave activity with ECG pads, matched the incoming patterns against digital patterns pre-saved on a MacBook, then fed the matched commands to a speech synthesizer chip that relays them to Siri. This way you can give Siri commands without talking or touching the screen of your iPhone 4S. A full “How they did it” and video can be found after the break.

How we did it!
1. ECG pads provide raw skin conductivity / electrical activity as analogue data (0-5 V).
2. This is plugged into the Arduino board via 4 analogue inputs (no activity = 0 V, high activity = 5 V).
3. The Arduino has a program burnt to its EPROM chip that filters the signals.
4. Josh trained the program by thinking of the main Siri commands (“Call”, “Set”, “Diary”, etc.) one at a time while the program captured the signature brain patterns they produce.
5. The program can detect the signature patterns that indicate a certain word is being thought of. The program will then wait for a natural “release” in brain waves and assume the chain of commands is now complete and action is required.
6. The series of commands is fed to a SpeakJet speech synthesiser chip.
7. The SpeakJet’s audio output simply plugs into the iPhone’s microphone jack.

So simple, we still can’t believe it really works.

Source [9to5mac]


2 Responses to “Project Black Mirror: Siri Controlled By Brain Waves [video]”

  1. Chris Says:

    Wow, imagine if Apple could pull that off wirelessly, it would definitely give them an advantage over the competition.

  2. Philip Galanter Says:

    This is a fraud. The SpeakJet chip is not wired up to anything at all (the way it is positioned on the prototyping board merely shorts out all the pins). There are lots of other reasons to be suspicious, but as someone who has actually designed circuits using the SpeakJet, the above is sufficient for me to reject this claimed project.
