I have a MicroSDCard inside an SDCard adapter sitting flush in my smaller-than-iPad netbook. I got it for "free" with the purchase of a cell phone, before the iPhone existed. I guess I need to get a MacBook to reproduce this unsightly protruding SDCard problem so I can then get a "niftyminidrive" to solve it? According to the title, it's a "hack". Would that make me a "hacker"?
This is almost as good as Microsoft's "ReadyBoost". Otherwise known as a USB slot.
It's a $5 USB stick used as a swap file. No, it's a "Windows Memory Expander". And the price is $49.95. Oh.
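For the curious, the do-it-yourself version of that "memory expander" is a couple of commands on any Linux box. A minimal sketch, assuming the stick shows up as /dev/sdb1 (that device name is an assumption; check with lsblk first, since mkswap wipes the partition):

    mkswap /dev/sdb1    # write a swap signature (destroys existing data)
    swapon /dev/sdb1    # start paging to the stick
    swapon -s           # confirm the new swap area is active

Dog slow compared to real RAM, of course, but so is the $49.95 version.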
Solution: Let these interface gimmicks be the stuff of tech conventions and demos, and never part of any mass-produced product.
But maybe some marketing folks, seeing the success of Apple, have reasoned that these gimmicks can sell products in the short-term, at least until consumers discover the problems with them.
At the end of this era of tapping and rubbing little screens with our fat, dirty fingers, we may be reminded why we had tactile interfaces to begin with. Whatever happened to the PDA stylus?
I think this is important, and the other important factor is that removing and reattaching the stylus adds at least 3-4 seconds to an interaction with one's phone. My interactions with my phone today are often under 10 seconds, which means a stylus would add 30-40% friction to those interactions.
The market despised the stylus. I tried setting up a wireless office using tablets. No one would use the stylus; everyone tried using their fingers instead.
Does he need money? One would think he probably has a pretty good day job already. But if he really needed it in order to continue writing software, I would bet many people would be willing to contribute to a "Bellard Writing Fund". His contributions to open source are really in a class by themselves.
Individuals can't deploy their own LTE base stations, unless they've got a couple million in their pocket for the radio frequency licenses. The only way this will be available to the consumer is if a telco produces it.
It's possible that innovation, specifically scientific progress that leads to innovation, actually relies on scientists sharing stuff with collaborators that patent lawyers would not want them to share. The patent lawyers might often be "the last to know". It's possible that many scientists often put their own scientific goals, which stand to enrich the scientific community as a whole, before the business goals of the employers they work for. Something tells me that Apple's patent lawyers do not have this problem with Apple engineers.
Not that I consider software developers to be scientists, but I wonder if there are any Apple engineers contributing to open source projects or even just publishing the occasional research paper.
Doesn't Apple have rules that keep all their employees from talking to anyone outside of Apple under threat of immediate termination?
No FDA for software. You can say that again. There's also no Medicaid, Medicare or health insurance that pays for it.
The iPaq was/is a fantastic device. You could attach peripherals. Try doing that with an iPad. I could put an entire OS on a CF Card and, assuming I could boot from the card, the expanded functionality would be limited only by the hardware specs. They are durable. I've seen consumer electronics businesses still using them to track inventory.
I wish HP would revive the iPaq.
My only imagined use for an iPad is as a portable display. I want the Retina quality, but I have more powerful hardware to attach and I need a real keyboard. There's nothing the iPad can do that my open, unlocked hardware cannot do.
I believe it was even possible to attach a real keyboard to an iPaq. That's the kind of flexibility I want. I can get data into and out of the device in any number of ways, without hassle.
I understand the comment. Of course the thought has crossed my mind. But I'm not so sure there's any evidence to support it.
I actually test some of my ideas with people like your mother, and surprisingly (why should I be surprised?) they have little trouble catching on.
What's really amusing is that these things that I have them doing are things that many nerds cannot themselves do. I've got them using systems and techniques that many nerds won't touch because they think they're too "hard core". It's hilarious.
There are lots and lots of unfounded assumptions about what users can and cannot do.
There are facts, supported by evidence. And then there are assumptions. One requires a bit of work. The other is effortless: you just hit "Submit".
I understand the comment. Of course the thought has crossed my mind. But I'm not so sure there's any evidence to support it.
Your post is vague enough that I have a hard time parsing it. No evidence for what? That there is such a thing as a "non techy" user? The evidence is overwhelming, including anecdotal evidence from practically every "techy" person here who has ever had to help their family/friends with a computer issue.
And it's not a question of ability, it's a question of ability + caring enough. The non-tech people might not care to boot an OS from a CF Card -- certainly not enough to seek out how to do it.
To mom-type:
Here's an iPad, you can use it to easily email your friends, check facebook, and check the web.
or...
Here's an iPaq 2012, you can do all the above, but it's a little harder to use, HOWEVER you can boot any linux distribution you want, add a USB device, attach ANY keyboard -- like one with mechanical switches! I personally prefer the Cherry Blues, but you might want to try the Topre ones. People who use those never go back. They cost a bit more (~$250), and you have to get it imported from Asia or find a U.S. distributor... but it's worth it.
anecdotal evidence indeed. that's precisely the point. there is no empirical evidence behind the vast majority of comments like these. conclusions without evidence or any indication of the methods used to arrive at them.
linux? are you kidding? this is exactly what i was referring to: assumptions. how did you conclude by booting an os i meant linux?
Actually that's not anecdotal evidence. Anecdotal evidence looks like this:
"It has been my experience that XXX"
The parent has made a different formulation, specifically:
"Every person belonging to group Y has had experience XXX"
This difference is significant, because the argument is basically stating that it is not only the experience of the person making the argument, but that the person making the argument is expecting that the readers of the argument are going to be able to confirm the experience for themselves. This is a much stronger argument than mere anecdote.
And for those that are already starting to lean on their keyboards to type "the plural of anecdote is not data", that platitude is a recognition that data is supposed to repose on a generalisable sample of reality, and if you are just going on anecdote, even multiple anecdotes, you are leaving yourself wide open to claims of cherry-picking. But this claim does not cherry-pick; it says that a vast majority of "techy" people should be able to confirm the claim from their own experience.
the original comment referred to a market for "people like [me]" being "not large". my response was that i have not seen any evidence to support that sort of claim. but i'm not even sure i know what he meant by "people like [me]". i had to assume i knew. the problem with assumptions is they can be wrong.
and i'm not sure i understand the reference to "techy people". i never mentioned such a group. i mentioned "people like the [commenter's] mom". presumably (another assumption), she's not a "techy person", whatever that is. but maybe i'm not a "techy person" either. what is the definition of "techy person" anyway? would the definition differ based on the person defining it? maybe i see no distinction between "techy" and "not techy". maybe i only see differences in how much a given person understands about what computers can do, and how to make computers do those things.
that's precisely the point. there is no empirical evidence behind the vast majority of comments like these.
Are you being purposefully vague? No empirical evidence of what? I'll counter that there is overwhelming evidence, but I'll share specifics once I know what your claim is. :)
no empirical evidence to support the original statement he made: "the market for people like [me] is not very large"
for one, what does "people like me" mean? people who can make use of non-apple hardware? what sort of uses? i don't know what he meant. i could take a guess. but then i would be making an _assumption_. and i might be dead wrong.
and that's what you did in your comment. you made some assumptions. what were they?
i already told you one: you assumed the os a "mom-type" would run would be various linux distros. what if it's not linux?
here's my guess: we're debating whether mdonahue's statement "people like you are not a large market, unfortunately" is true.
however as i pointed out, we haven't agreed on what "people like me" means. we cannot debate this statement until we have agreed on a definition for that. then we have to consider what is meant by a "large market". what is a "large market"? then we have to decide whether what is asserted in his statement, if true, is "unfortunate" or not. or maybe we can skip that since it seems like just a mdonahue opinion.
is this explanation still too vague for you? i'm not sure how much more specific i can get.
Did they fully understand the reasoning behind the techniques you taught, or was it just memorization?
I have taught my mom how to do certain things, but she has no intuition. As soon as something is slightly wrong or different, she gets stuck and can't move on. The solution is always something simple, like relaunching the app, installing an update, power cycling the computer, jiggling the USB cord, modifying the permissions on a file... but there are only so many contingency plans I can teach her.
Maybe I just suck at teaching, but I think that technical people have an incredible curiosity and comfort with troubleshooting that people like my mom don't. We are basically playing on our computers.
With the iPad, my mom is finally playing too. She is really adept at it. Sending photos, checking facebook, downloading new apps: she was never comfortable doing any of this on the computer. There were so many choices and settings and things to potentially screw up that she would be paralyzed, unable to explore and try things.
Ultimately I think the home button is the most important thing in the iOS ecosystem. If all else fails, go home and everything will be fine*.
(Unless your battery dies, or there is lint in the charging port, or your screen shatters, or you muted it, or you turned on airplane mode... it's not perfect...)
I think reasoning, in addition to basic instructions, is important even though some people might not care about it.
By leaving it out you deny those who do care an opportunity to learn.
And to me it just seems more respectful when someone asks you to do something and tells you why you are doing it than if they just give you bare instructions. (That said, the bare instructions should be able to stand on their own. They had better work, every time.)
Moreover, providing reasoning forces you to demonstrate you know the subject matter well enough to be able to explain it.
Yes, but perhaps _making things easy to use_ and _sexy hardware design_ might have something to do with their success?
The problem with the research labs at those example companies you cited is that those businesses have little incentive to introduce innovation that would compete with "yesterday's ideas" that are driving their profits.
This is why Bell Labs was so unique. They could basically do whatever they wanted (you might try to make the same claim with your example companies, perhaps) _but_ ... they also managed to release these ideas into the market. And not always to the satisfaction of AT&T. People once had to pay for UNIX. Not anymore.
Xerox PARC is another well-known case where people were "set free" to work on whatever they wanted. But their ideas did not manage to trickle out to the market very well. Instead, Microsoft got one of their key people, Excel was born, and the rest is history.
Apple is _not_ an idea factory. If someone called them two-timing thieves and told us to watch our backs, I would be inclined to take it seriously. (The fact that Apple is not the idea factory is why the lawsuits are so offensive to anyone who knows anything about the history of computers. If these sorts of broad patents should go to anyone, it should be people like the ones who worked at Bell Labs and Xerox PARC. But maybe patents were not their priority. Maybe they were more interested in research, or playing computer games, than money. [How many UNIX patents? 1?] Go figure.)
But Apple is a design house. And within IT, they do not have much competition in that area, e.g. the design of hardware casings. In addition, they go to great lengths to make the great ideas (namely the flexibility and stability of UNIX-like systems) easy to use. That is another area where IT is lacking: making the good stuff (like UNIX) easy to use.
Unfortunately, Apple feels the need to abuse the patent system to stay on top. It makes me think that if they didn't, they might be in for a big fall. Maybe they are surprised at their own success? And nervous about losing the top spot?
Incidentally you could argue IBM started all this software patent nonsense. Not sure many programmers would agree with you, but the number of filings and issued patents by IBM, most of them before Microsoft even had a patent department, tells the story quite clearly.
You are not going to see much innovation released from "research labs" at the likes of Microsoft or those other companies. They will not keep their patent department in the dark. Those guys want to keep their jobs, not take risks. "Microsoft Research" or "Google Labs" are not Bell Labs or Xerox PARC. It's a wonder that something like Kinect was even made into a product. And you could see how nervous they were about it.
Today, the "labs" and the idea factory are the world wild web.
The funny part, considering how everyone complains about not really getting to "own" their device, is bringing up AT&T as the good guy: the company that literally would not let anyone own a telephone.
They owned the network. And they wanted to control devices that could be used on it. (There may have once been legitimate reasons for this.)
Apple wants to control your devices. How you use them after your purchase. The network you use to obtain content. And even the content you download: you don't own it, they license it to you. There have never been any legitimate reasons for all this and there never will be.
"Wow, this just what we've been looking for! Where have you guys been all these years?"
The sad thing is it's not what we think that matters. If clients fall for the buzzspeak, it's irrelevant that what they're paying for, believing it's "new" and "different", already exists.
Just based on the Linus quotes, I think he's right.
The kernel did what it was supposed to do back in 1991.
What is the "desktop" supposed to do?
There will never be unanimous agreement on that - every user will have different needs and preferences - and so desktop developers live in constant denial, believing they are the only ones whose preferences are relevant - i.e. they know what's best for users - and blaming others for their own failings.
I'm not a Linux fan, but I give Torvalds a +1 for his response (and for knowing the value of "not breaking stuff").