BlindSquare has been out for just a few weeks and it has already been used on every continent. I get a lot of feedback about how it is helping blind people around the world.
Most of the feedback is positive, but I think we still have a lot of work to do in the area of combining user needs, technology and social data. That’s why I love to receive suggestions on how to make BlindSquare even better.
For feedback you can use the Feedback button in the app itself, or Twitter: just post a tweet with the hashtag #blindsquare and your thoughts. Some questions I answer by email or by replying to the tweet. If an answer turns out to be lengthy, I write a blog post referring to the question. And that’s what I’m going to do right now.
@fredshead: Has anyone used the blindsquare app on their iPhone? What’s your opinion?
@davetaylor2112: @fredshead I like it, a lot of potential, the more foursquare integration it gets the better
@ilkkapirttimaa (this is me): More 4sq is coming. At least these: Shouting, scores, tips, menus, phone number/calling.. What else?
@davetaylor2112: Let me play a little and think, all sounds good though, great app
@davetaylor2112 First thing is to be able to review what is spoken using VO throughout app please
To answer the question fully, I need to explain some aspects of the application.
Basically BlindSquare has two modes:
1) Active mode: You use the application through the iOS user interface: browse categories, search, check in to a foursquare venue and so on. You can do this with iOS’s VoiceOver (VO) feature, which uses the device’s internal speech synthesis. It basically reads aloud what’s on the screen (see demo: http://www.youtube.com/watch?v=WxQ2qKShvmc)
2) Passive mode: When you are walking down the street, BlindSquare automatically starts telling you what’s around you. Using VoiceOver for this kind of rich speech flow is not possible, which is why I use an additional text-to-speech synthesizer from Acapela (http://www.acapela-group.com/).
When using the app in mode 1, you have full control over every aspect. When using it in mode 2, you are more or less a listener: BlindSquare just helps you make sense of what’s around you. To get more control in mode 2, you can use an iControlPad Bluetooth controller attached to your cane or your dog’s harness. With it you can start tracking a place when you hear it mentioned. You can also run a search, search by category, change the radius, look around and so on without even touching your iPhone. It even lets you use an iPad on the street, with a whole day of battery life. But even with the controller, if you miss something that BlindSquare reported, there is no way to go back in time.
Should we add a new feature to BlindSquare: Speech History? When activated, you would get a list of all the places mentioned, latest first. You could go through the list with VoiceOver (VO) or the Bluetooth controller. From the list you could start tracking a place, check in, get the address, call the phone number and so on.
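To make the idea concrete, here is a minimal sketch of how such a history could work: a small buffer that records each place as it is spoken and hands it back latest first. This is illustrative Python, not the app’s actual code, and all the names (SpeechHistory, Announcement, the sample places) are my own invention for the example.

```python
from dataclasses import dataclass, field
from time import time

@dataclass
class Announcement:
    """One place BlindSquare has spoken aloud (hypothetical fields)."""
    name: str
    distance_m: int                           # distance when announced
    timestamp: float = field(default_factory=time)

class SpeechHistory:
    """Keeps the most recent announcements, latest first."""
    def __init__(self, capacity: int = 50):
        self.capacity = capacity
        self._items: list[Announcement] = []

    def record(self, name: str, distance_m: int) -> None:
        self._items.append(Announcement(name, distance_m))
        # Drop the oldest entry once the buffer is full.
        if len(self._items) > self.capacity:
            self._items.pop(0)

    def latest_first(self) -> list[Announcement]:
        return list(reversed(self._items))

history = SpeechHistory(capacity=3)
history.record("Cafe Engel", 120)
history.record("Stockmann", 80)
history.record("Ateneum", 200)
history.record("Kiasma", 60)   # oldest entry ("Cafe Engel") falls out
print([a.name for a in history.latest_first()])
# → ['Kiasma', 'Ateneum', 'Stockmann']
```

Each entry in the real feature would also carry whatever is needed to act on it later: the venue reference for tracking or checking in, the address, the phone number.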
Is this something that would solve this issue?