© 2018 Open EyeTap Inc.

Hazerfin
Mar 8, 2018

UX: User Experience

4 comments

My day job is in the User Experience (UX) field. I lead a team that designs easy-to-use software interfaces, and we use usability engineering methods to validate that our designs really are easy to use.

 

Seeing as that's my expertise, I'd like to help:

  • Design an intuitive desktop environment for EyeTap from which all apps can be launched.

  • Create human interface guidelines that can be used by developers to ensure all EyeTap apps have a consistent user experience.

  • Advocate for usability testing as a means to validate the ease-of-use of any application developed for EyeTap. I can also help facilitate usability testing of apps for developers. More on usability testing here: https://en.m.wikipedia.org/wiki/Usability_testing

 

If there are any other UX designers interested in EyeTap, I'd love to collaborate.

 

Any thoughts?

Maaaaax - EyeTap Team
Mar 9, 2018

Hey Hazerfin,

 

It's great to know that you're interested in joining our initiative, and in actually leading a part of it! The door is totally open for you. I studied Mechanical and Industrial Engineering in undergrad, and I really appreciate what I learnt in Human Factors Design and User Experience Design. I think that's a very important part of a great product.

 

I love the directions you proposed for improving the Open EyeTap; let's plan out some more details. How do you plan to achieve these things, and what resources do you think you'll need?

 

Also, feel free to share your AR glasses with us. I think there's a huge need in the maker/education community for affordable AR, and making that possible will rely on open-source initiatives like Open EyeTap.

 

 

 

Hazerfin
Mar 12, 2018

I'll try to come up with some sketches and post them.

Hazerfin
Mar 16, 2018

Some rough ideation of how the UI could look. Would require a Bluetooth earpiece (mic & speaker), IMU, and GPS.

 

And here's a quick mock-up of how the reticle could work:

 

Probably not how it'll end up, but it's a first stab.
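One way a head-tracked reticle could work is to map IMU yaw/pitch offsets into screen coordinates. The sketch below is purely illustrative: the 640x480 resolution, the FOV figures, and every name in it are assumptions, not part of any actual OpenEyeTap API.

```python
# Sketch: map head orientation (IMU yaw/pitch, in degrees) to a reticle
# position on the EyeTap display. The resolution and FOV values below
# are assumed for illustration only.

SCREEN_W, SCREEN_H = 640, 480   # assumed display resolution (pixels)
FOV_H, FOV_V = 14.0, 10.5       # assumed horizontal/vertical FOV (degrees)

def reticle_position(yaw_deg, pitch_deg):
    """Convert yaw/pitch offsets from the 'forward' direction into
    screen coordinates, clamped to the visible area."""
    x = SCREEN_W / 2 + (yaw_deg / (FOV_H / 2)) * (SCREEN_W / 2)
    y = SCREEN_H / 2 - (pitch_deg / (FOV_V / 2)) * (SCREEN_H / 2)
    # Clamp so the reticle never leaves the display
    x = max(0, min(SCREEN_W - 1, x))
    y = max(0, min(SCREEN_H - 1, y))
    return int(x), int(y)
```

Looking straight ahead puts the reticle dead centre; turning the head past half the FOV pins it to the screen edge.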

Maaaaax - EyeTap Team
Mar 16, 2018 (Edited: Mar 16, 2018)

Hasan,

 

Thanks for the insightful drawings. I really like the futuristic look of the UI. One thing about the current Open EyeTap that might disappoint you on the UI design front is the field of view. The FOV right now is comparable to what Google Glass has (14º), but that's still a very small window if you think about it... Once you get your EyeTap assembled, you'll see what I mean. Another startup from our lab, Meta, has a much larger FOV (90º); however, to achieve that they had to sacrifice the size of the device, and it's impractical to wear it on the street. I think this is a good start, and we can work together to make your design fit the current Open EyeTap better!
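To make the FOV comparison concrete, the apparent width of the virtual display follows from simple trigonometry: w = 2 · d · tan(FOV/2). A quick sketch (the 0.5 m viewing distance is an assumed, comfortable focal distance, not a spec of either device):

```python
import math

def apparent_width(fov_deg, distance_m):
    """Width of the virtual display at a given viewing distance,
    from w = 2 * d * tan(FOV / 2)."""
    return 2 * distance_m * math.tan(math.radians(fov_deg / 2))

# At an assumed ~0.5 m focal distance:
glass_like = apparent_width(14, 0.5)   # roughly 0.12 m: a phone-sized window
meta_like  = apparent_width(90, 0.5)   # 1.0 m: a desk-sized canvas
```

That order-of-magnitude gap is why a 14º display constrains UI design so much: there is only room for a small card of content at a time.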

 

 

New Posts
  • Cayden Pierce
    Mar 30

    Hello everyone! I decided to take a small break from the VMP project (see other posts) to hack something fun for the OpenEyeTap! I like to run and bike, so I have built a utility with a live speedometer built in. This library also includes a function to track your location as you run, so you can later view your route (soon I will overlay this on top of a map, maybe with the Google Maps API?), your speed along the route, and the total distance you traveled. Here's the GitHub repo: https://github.com/CaydenPierce/OpenEyeTapJogWear Give it a pull and try it out. This is great for beginners, as it is simple and was made in an hour or two. Future features to add would be live directions (Google Maps API, again) and a live update counter of distance traveled. This is a simple feature, but imagine the possibilities when a community of users starts building features of all kinds; then we will have functionality for all types of activities.
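The core of a live GPS speedometer is just the great-circle distance between two fixes divided by the time between them. This is a generic sketch of that calculation, not the actual JogWear code; the function names and the (lat, lon, unix_time) fix format are assumptions.

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS coordinates."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def speed_kmh(fix_a, fix_b):
    """Speed in km/h from two (lat, lon, unix_time) GPS fixes."""
    (lat1, lon1, t1), (lat2, lon2, t2) = fix_a, fix_b
    dt = t2 - t1
    if dt <= 0:
        return 0.0
    return haversine_m(lat1, lon1, lat2, lon2) / dt * 3.6
```

Summing the per-fix distances over a run gives the total-distance counter mentioned above, and keeping the fix list gives the route for later map overlay.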
  • Cayden Pierce
    Nov 7

    Steve Mann told me that a great use of the OpenEyeTap would be as a device to help those afflicted with prosopagnosia (an inability to recognize faces) recognize people. This is a first step in the direction of the "Virtual Memory Assistant" that I have discussed previously. Here it is! https://github.com/CaydenPierce/OpenEyeTapVirtualMemory Follow the install instructions to get things working. Let me know if there are any issues. I will put up an SD image shortly to make things easier (building the dlib C++ library takes a full day, haha).
  • Cayden Pierce
    Nov 6, 2018

    This is a continuation of my "Virtual Memory Assistant" post in the "Materials and Design" forum. From my experience and my studies in psychology, it seems evident to me that a central theme in the human mind is other people. That is, while you may not remember the time, date, weather, or your mood when an event took place, you will undoubtedly remember the people you were with. Thus, to create a virtual memory assistant, I have begun with people.

    Using the Python face_recognition library ( https://github.com/ageitgey/face_recognition ), I have begun developing the capability to store a database of human faces that can be recognized during use. I do not have access to public face datasets (something I'm sure will very soon become widely available), and data laws state that I may not take profile pictures from social media. Therefore, I am implementing the ability for the EyeTap to learn who you know as it is used. This works by saving a still image of everyone you communicate with during the day. Later, the user tags all new images with the names of the friends and coworkers they were interacting with. Thereafter, the EyeTap will always recognize that individual immediately. Everything I have described, I have already developed. It is not much, but I have not been working on this very long. I will upload it to GitHub and post the repo for this project soon.

    Once human beings can be recognized, the EyeTap has a framework for memory classification. Imagine accessing a database of every single conversation you've ever had with a person (I'm looking into conversation transcription using Google APIs). Imagine, at the end of each day, being presented with an automatically summarized description of the day's events and important points in conversation. Imagine the EyeTap briefly reminding you of small facts and conversation tidbits your biological brain has forgotten, WHILST in conversation.

    This would truly be an expansion of the human mind, exactly what Raymond Kurzweil, and many others, describe as the future of human symbiosis with technology.
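The "learn as you go" workflow described above (recognize if known, stash for tagging otherwise) can be sketched in a few lines. In the real project the encodings would be the 128-dimensional vectors from face_recognition.face_encodings(); here they are plain number lists, and the class and method names are illustrative, not the actual repo's API.

```python
import math

# Conventional distance cutoff used with face_recognition encodings;
# treated here as an assumption.
MATCH_THRESHOLD = 0.6

class FaceDatabase:
    """Illustrative sketch of a tag-it-later face database."""

    def __init__(self):
        self.known = {}      # name -> list of encodings
        self.untagged = []   # encodings seen today, awaiting a name

    def identify(self, encoding):
        """Return the closest known name, or stash the face so the
        user can tag it with a name in the evening."""
        best_name, best_dist = None, MATCH_THRESHOLD
        for name, encodings in self.known.items():
            for known_enc in encodings:
                d = math.dist(encoding, known_enc)
                if d < best_dist:
                    best_name, best_dist = name, d
        if best_name is None:
            self.untagged.append(encoding)
        return best_name

    def tag(self, encoding, name):
        """Evening tagging step: attach a name to a saved face."""
        self.known.setdefault(name, []).append(encoding)
        if encoding in self.untagged:
            self.untagged.remove(encoding)
```

After an unknown face is tagged once, any later encoding within the distance threshold resolves to that name, which is the "thereafter, the EyeTap will always recognize that individual" behaviour.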