I've read a lot of very critical reviews of Pixel Buds, but I'm not sure this rises to the level of "Silicon Valley Arrogance."
But you know what does?
- iPod headphones with wires that get brittle in the cold, so they pop out of your ears during a Chicago winter.
- iPhones that go into emergency thermal shutdown from being in your pocket on a spring/summer day in Arizona.
- An iMessage applet ecosystem that assumes that every iPhone user lives his life like a 20-year-old San Francisco metrosexual. (Launch day was Uber, OpenTable, and bill splitting. Yay.)
How about Silicon Valley offer some apps and accessories for the 99.99999% of the world that doesn't live in your climate? Or doesn't live your lifestyle? Stop patting yourselves on the back for staring at your own bellybutton.
(This isn't meant to be an Apple-specific rant; that's just the ecosystem I'm most familiar with.)
I live in Arizona and I can tell you that iPhones don't go into thermal shutdown while in your pocket. They will go into thermal shutdown if you leave them in the sun while you're outside working on a car on a >105°F day, and even then they will start working again once you get them out of the sun for a couple of minutes.
I've had my iPhone go into thermal shutdown in the Bay Area, so this is not SV arrogance, it's a physics problem. When I lived in colder climates with snow I never had a problem with their headphones falling out, or even using the capacitive touch screen (I just took my gloves off).
It sounds like you need the ToughBook equivalent of a smartphone, which is fine, but most users are fine working within the limitations.
I shouldn't need a ToughBook to commute on mass transit in a major city.
I don't think it's "most users are fine working within the limitations." It's "most users don't have a choice but to work within Silicon Valley's imposed limitations."
Weirdly enough, I feel like Google Maps and anything Yelp-like have the opposite problem in New York. When I search for something nearby, "two miles away" isn't remotely relevant when it's across the Hudson, in another state.
They tried to build a single system that works for everyone, and it works kind of okay unless you're on the edge of a bell curve, geographically speaking.
Conversely, Google Maps' attempts to reduce marker density produce terrible results in the Scottish Highlands: I know there are markers there, but at reasonable zoom levels the map is just blank.
Maybe I'm just an old fart, but "usability" of apps seems to be going down everywhere. For example, making "near" user-definable would be an easy fix, and I don't think it would cause any implementation problems on the mapping side, would it?
I notice feature after feature like this these days, where something has been implemented, but it isn't useful due to lack of attention to detail.
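For what it's worth, the client-side fix really is small. Here's a minimal sketch of a user-definable "near", assuming nothing more than a list of places with coordinates; the `haversine_miles` helper and the sample data are mine, purely for illustration, and a real mapping backend would use a spatial index rather than a linear scan:

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in miles."""
    r = 3958.8  # mean Earth radius in miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearby(places, here, radius_miles):
    """Filter places to a user-chosen radius instead of a hard-coded 'near'."""
    lat, lon = here
    return [p for p in places
            if haversine_miles(lat, lon, p["lat"], p["lon"]) <= radius_miles]

# A Manhattan user might set 0.5 miles; an Atlanta driver might set 15.
places = [
    {"name": "Barber", "lat": 40.80, "lon": -73.94},
    {"name": "Diner",  "lat": 40.74, "lon": -74.03},  # across the Hudson
]
print(nearby(places, (40.80, -73.95), radius_miles=0.5))
```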
It doesn't look downvoted to me. And I don't feel like it's an entitlement issue - eight million people live in NYC. That's more than the whole state of Arizona.
It's not unreasonable to expect a map to behave differently in the middle of a major city than a more rural area.
And in Atlanta I experience the opposite problem. Google Maps will limit searches to things close by, suggest alternates instead of exact matches, and routinely fail to find locations that I know exist, forcing me to manually select them on the map.
Unlike NY, in Atlanta everything is a 30-minute drive, so convenience is less about distance and more about time.
I hadn't put it into words before, but the problem you describe hits home. I want a 20-30 minute radius of search results, not things limited to the nearest mile. That's worthless in Atlanta.
I moved to ATL (midtown) in '07 and a car was a basic necessity.
But today? I think I could easily live in the midtown area and not need my car. I mean, I'd walk to Whole Foods from 10th and Piedmont regularly (there's a shortcut at the TJs ;P)
If you adjust the zoom you're given the option to search that area, but it doesn't do that by default; by default it zooms into a narrow radius around your current location.
Tangent: I used to walk on the weekends from my apartment at 123rd & Lex to the 125th Street A, take it up to 175th, walk a block over, get on the GW, walk across, then hike down to Hoboken, take the PATH back to Manhattan, and walk back uptown until my legs gave out. Depending on where I stopped, I would either subway or Lyft back to my apartment.
It allowed me to experience NJ which I would never have done. And I found places to get my hair cut for < $30, which was more than half off from my local barber, and clothes were tax free, which was TONS better than tax free under $100, etc. I learned so much traveling into NJ regularly, and eventually moved there.
You only had to walk 5-10 blocks south into Spanish Harlem to get a ~$13 haircut, but I totally agree on the taxes for purchases.
I always found NYC-specific travel funny. For example, it can take an hour and a half to get from the Upper West Side to Brooklyn, but a bus ride from NYC to somewhere like Philly is often under 2 hours.
If you'd kept walking north from 125th, you could go to Hamilton Heights where I get a haircut for twenty bucks. That's ten bucks left over for a beer or two across the street :)
For some reason, Google Maps tends to be horrible at getting subway transit right:
They frequently list delays when there are none and don't list them when the subway is running very behind. I wonder if it's just hard to get this kind of real-time data from MTA.
It depends on the line. Numbered ones have more accurate data, for example, as they rely on rail sensors to track trains. MTA recently pushed new Bluetooth-based technology to get location estimates on all the other lines. They're just estimates, still.
It's really an MTA problem, though. You can tell because they now have screens with arrival times everywhere. Even if the screen says that an E train is coming in 11 minutes, one moment later the same train (another E train?) is just two minutes away, if not already approaching the platform. Perhaps they only need to recalibrate the code that comes up with the estimates, since the beacon system is fairly recent.
The problem is the subways misuse their delay systems. A few months ago BART was using their delay feed to warn about a shutdown that would occur over a long 3-day weekend. Sure it's a "delay", but it's not an active delay that impacts riders now. Google Maps doesn't seem smart enough to figure that out. But honestly the operators should find better ways to alert riders about upcoming planned delays.
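The "active right now" check the aggregators are missing is cheap to express. Here's a rough sketch against a GTFS-realtime service-alerts feed, the format most agencies (including BART and the MTA) use to publish this data; the feed URL is a placeholder, and whether Google's ingestion looks anything like this is pure guesswork on my part:

```python
# pip install gtfs-realtime-bindings requests
import time
import requests
from google.transit import gtfs_realtime_pb2

FEED_URL = "https://example.com/gtfs-rt/alerts"  # placeholder, not a real feed

def active_alerts(feed_url):
    """Return only alerts whose active_period covers right now,
    skipping announcements for planned future shutdowns."""
    feed = gtfs_realtime_pb2.FeedMessage()
    feed.ParseFromString(requests.get(feed_url).content)
    now = int(time.time())
    current = []
    for entity in feed.entity:
        if not entity.HasField("alert"):
            continue
        periods = entity.alert.active_period
        # Per the GTFS-realtime spec, no active_period means "always active".
        if not periods or any(
            (not p.HasField("start") or p.start <= now) and
            (not p.HasField("end") or now <= p.end)
            for p in periods
        ):
            current.append(entity.alert)
    return current
```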
>" An iMessage applet ecosystem that assumes that every iPhone user lives his life like a 20-year-old San Francisco metrosexual. (Launch day was Uber, OpenTable, and bill splitting. Yay.)"
Can you elaborate on this? What do you mean by "An iMessage applet ecosystem"? What Launch day are you referring to?
My iPhone will go into thermal shutdown on a hot New England July day. I got a waterproof case for it and throw it in the water until it cools off. Works great.
Maybe because SV, at least from my outside-SV vantage point, projects itself as working on life-changing, world-altering, everyday-simplifying products. Given their ignorance of life outside SV, their self-image is arrogant.
It is kind of a UX curiosity. When people don't speak the same language, they are trained to have a third party "translator" to defer to. You speak, I watch you. Then I watch the translator. Then I speak, and so on.
That translator could be a person, a phone running Translate, a notepad you're drawing pictures on, whatever. The point is, you have something physical acting as the intermediary and so it's natural to direct your attention to it.
With the Pixel Buds, they're taking that physical manifestation away but the problem is under the hood nothing has changed. So you're still having this fragmented three-way conversation, but now there is no physical intermediary that makes it feel natural.
When you're trained to work with a translator, you're actually trained to not watch them - you keep eye contact with the other person the whole time and just listen to the translator, and it isn't a three-way conversation.
Good point, and that's kind of where I'm coming from--when I said "people are trained" I meant implicitly, for regular people who would only encounter this rarely. So is this similarly just a question of user training to break from the "natural" interaction?
But you still both know when the translator is speaking, and so know to wait for the translation to finish, and you can register facial expressions to gauge the effectiveness of the translated message in communicating what you are trying to say.
"It is small contribution to the vast corpus of complaints about what happens to product design when an engineer’s focus on problem solving blinds them to the norms of social interaction."
Product design is part of engineering. Good engineering entails solving the actual problem within the actual constraints. In this case, ignoring the constraints of social interaction is bad engineering.
A very interesting observation about the shared public experience of both talking into a phone, vs. the semi-private experience of receiving one half of the conversation in your 'Buds.
This echoes at least part of what made talking to people wearing Google Glass so odd - you didn't know what they could see, or where their mind was. In the same way having a conversation with someone while you glance at your phone is rude – because it creates an unevenness of attention – any interaction where you have a private component publicly and visibly involved is going to be awkward.
This is a really good observation. I ponder the UX of new products a lot and have to accept I didn’t think of this before. But I would expect Google to have realised this problem in their user testing (if it involved non-techies at all). Pixel Buds feel like a product rushed to market (because they had to remove the headphone jack?) and not well thought out or implemented.
Did anyone really think Pixel Buds were going to be the must-have headphones for everyone? It was a cool example of applied technology that grabbed a bunch of headlines and gave people something to think about (what if I could talk to everyone regardless of language?).
The slides in the office are there to get press; the earphones that are clearly not going to be a huge seller are there to get press.
Can someone explain what the 'secret sauce' is in Pixel Buds that makes them unique? If the phone/internet is doing the translating, why can't any earbuds/headphones with a mic, connected to the same phone with the same software, do the same thing?
Maybe they do something special with noise cancelling? I think it's mostly "We want to sell these, so we're only activating this feature if you use them."
I think the bigger problem with both is that they're tech demos rather than products.
I got to demo Google Glass at a tech meetup once. It wasn't the first release, so there had been time to work bugs out. It rapidly overheated after taking and looking at a couple of pictures, and the battery life was abysmal. It demonstrated several capabilities that will probably be included in useful products in the future, but it was essentially unusable.
I bet in a decade or so, smart glasses will take off. As soon as they're actually useful, the social resistance to them will get worn down.
> It is telling that this product comes from the same people who brought us Google Glass, an ugly, invasive face-mounted camera that evoked hostility wherever it was worn.
I think what would be helpful is some type of visual indicator on the buds (maybe a pulsing charging indicator) to express that the translation service is speaking so that the other party can see that you’re occupied with listening to the translation.
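Something like this on the firmware side would be enough. Everything below is hypothetical (`Led` and `TtsPlayer` are stand-ins I made up, not any real Pixel Buds API), but it shows how little logic the suggestion actually needs:

```python
import math
import time

class Led:
    """Stand-in for whatever drives the charging LED."""
    def set_brightness(self, level: float) -> None:  # 0.0 to 1.0
        pass  # would write to the hardware

class TtsPlayer:
    """Stand-in for the translation playback engine."""
    def is_speaking(self) -> bool:
        return False  # would report real playback state

def pulse_while_translating(led: Led, tts: TtsPlayer, hz: float = 1.0) -> None:
    """Pulse the LED while translated speech plays, then switch it off,
    so the other party can see you're occupied with listening."""
    while tts.is_speaking():
        t = time.monotonic()
        led.set_brightness(0.5 + 0.5 * math.sin(2 * math.pi * hz * t))
        time.sleep(0.02)  # ~50 updates/sec keeps the pulse smooth
    led.set_brightness(0.0)
```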
What, excluding the Google Doodles, appears to be made for preschoolers, and why?
Apple's (industrial) design is, if I had to use just two words, flat and alien. Apple's UIs and product logos are more colorful, but their physical products are almost without fail white or brushed aluminum, and foreboding.
Google uses color in its software, but if you take a look at google's physical products ("Google Home", "Google Pixel phone", etc.), you'll note that they're less colorful than their logos and software.
If you look at the design guidelines you see a stark difference in the starting point.
Apple's [1] guidelines are basically "how to design your stuff to fit well with Apple". Google [2] on the other hand tried to create a visual language that was generic and understandable by people of all technical levels (with a focus on the "next billion users," people who are just getting their first computing device), and then open sourced it.
From the article: "It is worse that Germans possess an inherent distrust of Silicon Valley firms so asking them to speak into a phone while you’re wearing earphones is an invitation for abuse."
Is this true? I've never been to Germany and don't know many Germans, but this seems surprising to me.
I wouldn't be surprised: a lot of Germans used to be East Germans, and the Stasi were pretty thorough in their surveillance. EDIT: Watch "The Lives of Others" when you get a chance. It's a great movie on the subject.
If you don't like it, by all means criticize it (with the caveat that it's your opinion, or stick to facts) and don't use it. I don't understand the Silicon Valley bashing; if you are so smart, why don't you build something better?
Google doesn't do voice-to-text on the phone; they use an online service. It would take a ginormous corpus of training data and CPU to quickly process most of the ways people pronounce even one language.
Their conversational translation is voice-based because typing back and forth on the phone takes too long. It works wonderfully when there is internet, but you're screwed if you're traveling without a data plan. Then you're forced to "learn a language" or "use hand signals" or "write things down" like some 20th-century chump.
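For the curious, the online pipeline described above looks roughly like this using Google's public Cloud Speech and Translate client libraries. To be clear, this is a sketch of the general approach, not what Pixel Buds actually run, and both calls are network round trips, which is exactly why no data plan means no translation:

```python
# pip install google-cloud-speech google-cloud-translate
# Requires Google Cloud credentials and, crucially, a network connection.
from google.cloud import speech
from google.cloud import translate_v2 as translate

def translate_utterance(wav_bytes: bytes,
                        source_lang: str = "zh-CN",
                        target_lang: str = "en") -> str:
    """Speech -> text -> translated text; every step is a server call."""
    stt = speech.SpeechClient()
    config = speech.RecognitionConfig(
        encoding=speech.RecognitionConfig.AudioEncoding.LINEAR16,
        sample_rate_hertz=16000,
        language_code=source_lang,
    )
    audio = speech.RecognitionAudio(content=wav_bytes)
    response = stt.recognize(config=config, audio=audio)
    transcript = " ".join(
        result.alternatives[0].transcript for result in response.results
    )
    # Second round trip: text translation. Offline, both calls just fail.
    result = translate.Client().translate(transcript, target_language=target_lang)
    return result["translatedText"]
```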
This seems like a bit of an overreaction. Google has to talk up Pixel Buds; they are a couple of months behind Apple's AirPods.
Frankly, I like the form factor better than the AirPods, because they actually look like headphones. I'm not sure the gestures are all that big a deal; give it a couple more months and the software will become more finely tuned and people will adjust.
These will be really useful for group conversations. I'd still use the phone for one-on-one interaction, but ear buds make perfect sense if you are mostly following along and only speaking occasionally.
I really need someone to design a stand-alone Pixel Bud equivalent for hearing aids that does speech translation. I really need it, and I'm willing to invest in anyone doing that.
Well, probably they won't change the world. But the author's complaints sound like only that: complaining. Specifically, complaining in the vein of:
but it's not PERFECT!
or
I prefer something else ::pouty face::
All of this from their own singular experience, where we are supposed to take their initial example, using a phone to translate with a taxi driver in China, as vastly superior. But we have only the author's POV for that: the taxi driver may have thought it just as ridiculous as the author feels the buds to be. Or maybe the driver loved it too, but that's the point: we don't know, and neither does the author, but the author is generalizing from these limited experiences to everything, and we're supposed to go along for the ride.
This doesn't even get to the specifics: the crux is that the author dislikes that the translation is in-ear. There are other minor complaints, but that's the heart of his rant. A bit overblown: translation at the UN itself is "in ear," where the members decide which of six official languages to listen to on their headphones. Not that we should hold up any single example as the ideal: we shouldn't. Different mechanisms may work best in different circumstances. That's the problem with this take-down of the buds: it is sweeping and generalized, and supremely self-centered in its outlook on what everyone else should appreciate.