
Hazerfin
Mar 7, 2018

Rearranged EyeTap

7 comments

Edited: Mar 7, 2018

Here are 4 ideas for how the EyeTap might be rearranged and thereby improved:

 

IDEA #1: Move the Pi to Your Pocket

  • Why: this allows you to reduce the bulk on your head and upgrade the Pi from a Zero to a more powerful full-sized Pi 3, which I suspect we'll need for any image recognition. You are tethered to a battery in your pocket anyway, so you might as well be tethered to the Pi too. Just put the Pi on a belt clip or something if your pockets are getting too full (or too warm!).

  • How: Keep the "Spy Camera Flex Adapter" integrated in the glasses. However, the "Spy Camera to RPi flex cable" would need to be lengthened so that it reaches your hip/pocket-mounted Pi 3. All wiring from the micro display to the Pi would also have to be lengthened. The tether to your pocket will admittedly be thicker. Maybe wrap the flex cable around the micro display wiring like a burrito? lol

 

IDEA #2: Make an IMU and GPS Standard Hardware for All EyeTaps

  • Why: IMUs allow us to do very interesting things in AR. They can detect your head's orientation in 3D space, and if a barometer is integrated, your altitude as well. GPS gives your position on Earth.

  • Use Cases: If EyeTaps share this information with each other, we can overlay information like a public profile when you look at another EyeTap user (e.g. screen name, any interests they'd like to share, etc.). You can also place digital artifacts in the world (e.g. digital post-it notes) with associated GPS coordinates, and other EyeTap users can see them lying about when they come close to those coordinates. When you look at the night sky, you can overlay the names of the stars, much like the Sky Map app does on your smartphone. IMUs can also be used to interact with the UI projected into your eye (e.g. look at a button for 2 seconds to press it; see the sketch after this idea). To do anything cool in AR, you need an IMU and, to a lesser extent, a GPS chip in your EyeTap.

  • How: Now that we've freed up some room by removing the Pi from the glasses, replace it with a tiny IMU chip, preferably one with a barometer (a 10-DOF module). The wiring will have to run down to our hip-mounted Pi, but all the hardware on the glasses could share a single power wire and ground wire. As for the GPS chip, it could reside with the Raspberry Pi. Alternatively, we could use a separate barometer chip residing with the Raspberry Pi as well.
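
As a rough illustration of the dwell-to-press and barometric-altitude ideas above, here is a minimal Python sketch. It is not from the EyeTap codebase: it assumes some IMU driver already reports yaw/pitch in degrees and the barometer reports pressure in hPa, and the class and parameter names are made up for illustration.

import time


def pressure_to_altitude(pressure_hpa, sea_level_hpa=1013.25):
    """Approximate altitude in metres from barometric pressure
    (international barometric formula)."""
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** (1.0 / 5.255))


class DwellSelector:
    """'Look at a button for 2 seconds to press it': fires once head
    orientation (yaw/pitch from the IMU) has stayed inside a small cone
    around the target for dwell_s seconds."""

    def __init__(self, target_yaw, target_pitch, tolerance_deg=3.0, dwell_s=2.0):
        self.target = (target_yaw, target_pitch)
        self.tolerance = tolerance_deg
        self.dwell_s = dwell_s
        self._entered = None  # time at which gaze entered the target cone

    def update(self, yaw, pitch, now=None):
        now = time.monotonic() if now is None else now
        on_target = (abs(yaw - self.target[0]) < self.tolerance and
                     abs(pitch - self.target[1]) < self.tolerance)
        if not on_target:
            self._entered = None
            return False
        if self._entered is None:
            self._entered = now
        return (now - self._entered) >= self.dwell_s


if __name__ == "__main__":
    print(pressure_to_altitude(1000.0))  # roughly 110 m above sea level
    button = DwellSelector(target_yaw=10.0, target_pitch=-5.0)
    # In the real device, yaw/pitch would come from the IMU driver every frame.
    for step in range(30):
        if button.update(yaw=10.5, pitch=-4.8, now=step * 0.1):
            print("button pressed at t = %.1f s" % (step * 0.1))
            break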

 

IDEA #3: Preserve your Peripheral Vision by Moving the Micro Display

  • Why: Our peripheral vision is more important for situational awareness than our vertical visual arc. When we are driving (with navigational directions projected into our eye, of course), we look side to side more than up or down so we don't hit other cars or pedestrians. We wear baseball caps without an issue because most of the interesting world around us is on the horizontal plane; we're willing to forgo a bit of the vertical plane. Having the bulky micro display sitting beside the eye probably blocks our peripheral vision, which is not ideal.

  • How: Move the micro display above the eye instead of to its side. The micro display would then project downwards. The beam splitter would also need to be rotated to accommodate. Lastly, the camera would need to be mounted under the eye, pointing up.
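
On the software side, this mount is cheap to support. A minimal sketch, assuming the existing CSI "spy camera" stays in use and is driven by the stock picamera library: mounting the camera under the eye pointing up just means flipping the image before it reaches the display pipeline. The file name and resolution below are placeholders.

from time import sleep
from picamera import PiCamera

camera = PiCamera()
camera.resolution = (640, 480)
camera.rotation = 180        # compensate for the upward-pointing, under-eye mount
# If the module ends up mirrored rather than rotated, use hflip/vflip instead:
# camera.hflip = True
# camera.vflip = True

camera.start_preview()
sleep(2)                     # let exposure and white balance settle
camera.capture('upright_frame.jpg')
camera.stop_preview()
camera.close()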

 

IDEA #4: Cover the Left Eye Instead of the Right Eye

  • Why: Most people are right-handed and will therefore use their right eye for certain activities like archery, hunting with a scoped rifle, etc. Also, about two-thirds of people are right-eye dominant and will use the right eye to look through a peephole in their front door or into a microscope. Having a bulky beam splitter covering your dominant eye might interfere with these activities.

  • Example: I'd like to use my eyetap to identify stars in the sky. I then want to look at those stars through my telescope. The eyetap overlays the names of stars when I look up at the sky (using that IMU we talked about earlier). I then bend over my telescope and place my right eye over the viewing eyepiece. I subsequently bump my telescope with my eyetap, scratch the beam splitter, and use some poorly chosen words to express my frustration. Had the beam splitter been over my left eye, I wouldn't have had an issue.

  • Even Better: make 2 versions - one for right eye dominant people and the other for left eye dominant people.

Maaaaax - EyeTap Team
Mar 7, 2018

 

 

1. I feel like we have to find a new camera solution very soon... the flex cables are not very easy to work with. They are expensive and fragile... Maybe we should move towards USB, but that means we might not be able to use the CSI ports on the Pi... We need to find a solution for this if we keep using the Pi.
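
For reference, a rough sketch of the USB route: a plain UVC webcam read through OpenCV instead of the CSI flex cable. The device index and resolution are assumptions, and the trade-off is losing the CSI port's low-latency path on the Pi.

import cv2

cap = cv2.VideoCapture(0)                     # first USB camera on the system
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 640)
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 480)
if not cap.isOpened():
    raise RuntimeError("no USB camera found")

ok, frame = cap.read()
if ok:
    cv2.imwrite("usb_test_frame.jpg", frame)  # quick sanity check of the feed
cap.release()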

 

 

2. We have thought about an IMU and GPS for a long time, but we haven't had the chance to make it happen yet. We actually have a logo design for those two modules... see attached.

 

 

Also, in terms of where to mount them, maybe we can consider the right side of the EyeTap... I mean, that helps with weight balancing too, right?

 

3. Very good point! I love how you think from a very practical, real-life scenario. In old EyeTap designs (1980s), there was one that had the camera pointing up rather than in from the side. Again, see the image below. Maybe we can make a similar design for the Open EyeTap.

 

4. I love this, and I love how you wanna use it... This is actually gonna shorten the wires. The only concern I would have is weight balance: the current design has the Pi on the left, and if we move the display to the left as well, it will be heavily unbalanced. Maybe we should move the Pi to the right, or move it down.

 

Max

Hazerfin
Mar 7, 2018
Edited: Mar 8, 2018

Max, great point about balance. Worst case, we could put some dead weight on the glasses to counterbalance if needed.

 

 

Or add more hardware :?) Microphone, speaker, rear view wide angle camera, LED flashlight, etc.

 

Or maybe we could use a strap in the back to keep the glasses in place like this: https://www.amazon.com/ONME-Adjustable-Retainer-Universal-Sunglass/dp/B01CXZ6F3U/

Hazerfin
Mar 14, 2018

Here's a quick mock-up of the rearranged EyeTap:

 

We could alternatively use only half of the eyeglass rim (keeping only the portion from the bridge of the nose to the camera), though I worry about vibration.

mann
Mar 14, 2018

Hasan: Max told me all about you!... Yes, top-to-bottom is a great way to go for many situations. Perhaps you can do a quick Fusion 360 model design and send it to us to test out. Keep up the great suggestions -- Steve

Hazerfin
Mar 14, 2018

OMG, the Mann himself! What an honor!! I'm a huge fan.

 

I've never worked with Fusion 360, but I'll take a crack at it.

poague
Nov 7, 2018

Better ideas to improve the project... how about Bluetooth? No wires. Maybe better cameras and an HD display, a better CPU... upgrade the works. Keep up the good work!
