Apple is the antithesis of the hacker ideal. They're just as bad as Microsoft.
I mean, seriously. Have you ever been to a RadioShack? Multi-charging devices are a common product there. Yet Apple will have none of it. It's clearly an anti-competitive measure aimed at making sure they're your only supplier.
Furthermore, Apple's chargers suck. They deliberately use a weaker rubber sleeve around the end of their laptop charger cables because it looks aesthetically nicer. It's been proven to be weaker than the conventional rubber joints on most laptop chargers, but they don't change it, because they value aesthetics over functionality.
This is why I don't understand why so many hackers these days like to use Apple products.
Avoid if possible.
That's the thing. I can't. Because I want a computer I don't hate. There's nobody else who makes a machine I want to spend 8+ hours a day working and playing on.
There are some PC manufacturers where I can get some of what I want, but I've yet to see one where I can get all of it--I like Lenovo's build quality and aesthetic, for example, but their laptops are universally underspecced for what I want (if your only GPU is Intel, you are not getting my money) and I can't get a Retina display, which I now consider mandatory, anywhere else. And, while we're at it, if you're significantly heavier than my rMBP and don't provide at least competitive battery life, you're out too. I carry around enough crap as it is.
Worse, there are no operating systems other than OS X that fulfill my needs of a pleasant-to-use Unix/Linux system--certainly no Linux distribution comes close on the "pleasant-to-use" part; I'd rather use Windows 7 and Cygwin than any Linux desktop I've been subjected to in the last five years.
For me it's the same as it was with iOS--until Android 4.0 there was simply no worthy competition to iOS as far as I was concerned, but 4.x is fantastic and I switched over because it gave me the environment I don't hate plus the ability to mess around and customize it to my liking. But the first part is more important. If there was a Linux distribution with Apple levels of attention to detail and a modicum of taste--and that doesn't mean "looks like OS X", something different could be fine so long as it was designed for human beings instead of neckbeards and was uncompromising in its attention to detail--I'd probably be there. There isn't (and very well may never be), so I'm not.
> Worse, there are no operating systems other than OS X that fulfill my needs of a pleasant-to-use Unix/Linux system--certainly no Linux distribution comes close on the "pleasant-to-use" part; I'd rather use Windows 7 and Cygwin than any Linux desktop I've been subjected to in the last five years.
Sounds to me like this is just a matter of what you're used to. From my experience, OS X's UI is the last thing anyone would want if they weren't used to it. The window management is pretty damn horrible. OTOH, Linux has a variety of window managers with many innovative paradigms.
> I like Lenovo's build quality and aesthetic, for example, but their laptops are universally underspecced for what I want (if your only GPU is Intel, you are not getting my money) and I can't get a Retina display, which I now consider mandatory, anywhere else.
I don't know what you're doing with your machine that makes Lenovo machines underspecced, but for web and Android development, the ThinkPad X1 Carbon fits the bill. I wrote about it here: http://news.ycombinator.com/item?id=4848375
If you want discrete graphics, that is available in other models, such as the T series, IIRC. However, for anything other than gaming or video editing, integrated graphics really is enough. I personally prefer not having discrete graphics, as it keeps me from playing games.
The only thing missing is retina support, but you have the advantage of a lighter and thinner laptop than the MBP. It's no different from the MBA in that respect, with the added advantage of a 14" screen in the same body size as the 13" Air.
> Sounds to me like this is just a matter of what you're used to. From my experience, OS X's UI is the last thing anyone would want if they weren't used to it. The window management is pretty damn horrible. OTOH, Linux has a variety of window managers with many innovative paradigms.
Your experience does not match mine, but thank you for trivializing mine by assuming that it's just what I'm used to. Never mind that I come from a Windows and Linux background and moved to OS X about three years ago, right?
I find the OS X window model appealing (though not without its warts) and its gesture support fantastic. Mission Control maps far better to the way that I think about multiple desktops than GNOME's or KDE's--and I compare to those and not your "innovative window managers" because I've been there, I've done that, and I have determined that they're not for me. I have been down the tiling window manager road and I find it demanding of micromanagement; it does not map to how I think or work and I have no interest in contorting to fit it. Likewise, I have no interest in putting up with sharp bits and pointy edges in Unity or GNOME 3 or KDE 4 (I've been there, too, and reject the T-shirt).
> If you want discrete graphics, that is available in other models, such as the T series, IIRC. However, for anything other than gaming or video editing, integrated graphics really is enough.
The rMBP is not my only computer, but, yes, I want to be able to play games on it when I'm traveling. The rMBP is really good at it (I generally just play through Parallels; I don't bother to reboot). The T-series Lenovos are massive, heavy, lack Retina displays, and aren't nearly as aesthetically pleasing as the X1 Carbon is. Would never buy.
> The only thing missing is retina support, but you have the advantage of a lighter and thinner laptop than the MBP.
"Except for the most important thing, you get some of the things." Sure, I'm being flip there, but it's not unserious: for my money, retina support is just not optional anymore. People say they can't go back after using a retina display because it's true. For me, the difference is that stark. The 3x 21" 1080p panels on my desktop feel distractingly blurry and they make me irritable to work on for long periods of time. (Most disappointing thing to me is that the DisplayPort outputs on the rMBP can't output 4K, but 2560x1600 displays look okay when put far enough back on my desk.)
> Never mind that I come from a Windows and Linux background and moved to OS X about three years ago, right?
I think that a lot of people are willing to put up with the transition to OS X because of the RDF and the iPod/iPhone halo effect, but the transition away has no such phenomenon to motivate people to use a platform long enough to get used to the differences. However, this is just a general observation.
> I have no interest in putting up with sharp bits and pointy edges in Unity or GNOME 3 or KDE 4
Meh, I find that the "sharp bits" are no more sharp than those in OS X. No platform can be perfect for any given user, since we all have different preferences. The closest you can get is with a highly-customizable tiling window manager, but anything like OS X, Windows or KDE wouldn't work like this.
> The T-series Lenovos are massive, heavy, lack Retina displays, and aren't nearly as aesthetically pleasing as the X1 Carbon is. Would never buy.
You clearly are not up to date on this. Have you seen the T430u? It weighs less than the 15" rMBP, has discrete graphics, and has similar aesthetic stylings to the X1 Carbon. For example, the optical drive has been omitted.
> for my money, retina support is just not optional anymore
FWIW, I've found it to be half-baked when a lot of software and most websites don't support it yet. Browsing the web on a retina display is rather jarring. Moreover, since I regularly use external displays at my desk, switching back and forth would be quite jarring as well.
> Browsing the web on a retina display is rather jarring.
I would like to preface this by saying that I couldn't give a rat's ass about the rest of the arguments but in this case I just have to chime in since the quoted bit is just bullshit. The only difference between browsing the web on a retina display and browsing it on a normal screen is that, on a retina display, the font rendering is a million times better. That's it. But that is enough to make me not want to look at any other type of display again.
It is true that some applications do not have retina support (I'm looking at you, Firefox) but that support will come with time and all of the applications that I use on a daily basis look great.
Firefox 18 beta finally supports retina. I think its release date is January or so.
Also, 1000% this: high-DPI, retina, whatever you want to call it, the rendering is so much better than non-retina that I've asked my bosses if I can buy a MacBook Pro to use for work. I work with text all day, and this screen is so markedly better for text it's not funny. Jarring? Not in the least. Jarring is using other displays after this.
Call me a fanboy if you want, but when my eyes don't fatigue from extended reading sessions, I think I'll take the fanboy label.
> I would like to preface this by saying that I couldn't give a rat's ass about the rest of the arguments but in this case I just have to chime in since the quoted bit is just bullshit. The only difference between browsing the web on a retina display and browsing it on a normal screen is that, on a retina display, the font rendering is a million times better.
Actually, this is what's bullshit. Having used a retina MBP, viewing websites without retina images looks very jarring, especially when the browser itself has retina assets.
What about people who transitioned to OS X before any iOS devices? Are they sadists? That's empty rhetoric. You're just projecting your ideas. OS X is way more pleasant to use than any Linux distro. The design-by-committee that happens on most distros doesn't give the best results.
There are plenty of TWMs for OSX if you want it: divvy, sizeup, tyler wm, etc.
> Browsing the web on a retina display is rather jarring
It's exactly the same as browsing on a non-retina display. As in physically exactly.
> What about people who transitioned to OS X before any iOS devices? Are they sadists? That's empty rhetoric. You're just projecting your ideas.
Next time, take a second to restrain yourself from hastily typing out a polemic and hitting reply, so that you can reread the post you're replying to. I also mentioned the iPod, which has been around since 2001 and has also played a significant role in the increase in popularity of Macs.
> OS X is way more pleasant to use than any Linux distro. The design-by-committee that happens on most distros doesn't give the best results.
From this, it sounds like you're the one projecting your ideas, actually.
> There are plenty of TWMs for OSX if you want it: divvy, sizeup, tyler wm, etc.
Most tiling window managers for OS X don't give you the level of control necessary to make them actually worth it. Sure, some keyboard shortcuts are nice, but that's not the real point.
> It's exactly the same as browsing on a non-retina display. As in physically exactly.
Uh, if it were "physically exactly" the same, then what would be the point of getting a retina display?
The ThinkPad X1 Carbon has a trackpad of quality equivalent to the MacBooks'; just look at reviews online. Plus you get the TrackPoint, which is much more efficient once you get past the learning curve.
And then comes the amusing lack of proper configuration on the OS side. I am not an Apple person, but I have yet to see a trackpad configuration that works even remotely as well as the MBP ones. The speed and acceleration multipliers in there have always been off.
EDIT: +1 on the TrackPoint. I disable the trackpad on all my ThinkPads and just use the TrackPoint whenever I can.
I meant moving to OSX before owning any iOS device. Every developer I can remember right now that uses a Mac bought it because of the nice hardware and software, not because it's cool (they cost 3x+ as much as a PC over here, so it has to be worth it).
On retina: you were saying that a lot of software and websites don't support it; in that case, the image on the screen is exactly the same as on a non-retina display: four pixels make one 'standard' pixel.
I had not, actually, and thank you for pointing me at it--I'd seen the T430, but not the u variant. That's much more like it. I dig that. If it could run OS X I'd be all over that. =)
> but it's not unserious: for my money, retina support is just not optional anymore.
It's interesting how one thing makes or breaks the deal. I'm the same :-), though my top feature is the keyboard (I type a lot). That's why I always stick with ThinkPads. Sadly, I'm screwed for my next upgrade, because Lenovo has announced that they're ditching the traditional keyboard in favor of a chiclet version. That goes for their entire ThinkPad line.
> Sounds to me like this is just a matter of what you're used to. From my experience, OS X's UI is the last thing anyone would want if they weren't used to it. The window management is pretty damn horrible. OTOH, Linux has a variety of window managers with many innovative paradigms.
No, sounds like you are the one that's held back by what you're used to.
Just like the OP, I'd rather use OS X over Linux and Windows; it's a toss-up for 2nd place between some of the great modern Linux distros and Windows 7.
I have used OS X extensively as well, so I have absolutely no idea on what basis you're making that statement. I've used OS X on a variety of devices, from laptops to high powered Mac Pros, and the window management has always felt sluggish to me in comparison to Windows and Linux. Having the menu for all the windows at the top of one monitor is quite simply infuriating, especially when using 2 or 3 monitors.
At least Apple finally fixed not being able to resize windows from any side last year. That still doesn't match Linux window managers though, which let you resize windows with Alt+right click and move them with Alt+left click from anywhere within a window. And nothing comes even close to the control possible with tiling window managers like Xmonad.
The cursor in OS X has also always felt very off - I think it's an issue with the mouse acceleration curve. I have repeatedly tried to fix this with 3rd party utilities, but nothing ever worked.
One of the somewhat frustrating assumptions running through your posts is that everyone materially gives a damn about the "control" of your window managers--as if the window manager is of more bracing importance than the applications running within. For you it may be, but to post as if this is a universal case is a little much.
Personally, I've sampled Xmonad. I don't like it. I found it actively tiring and demanding of micromanagement to work with it. This is in contrast to how, on OS X, I literally don't think about windows at all. I do everything with cmd-tab and swipe gestures on the touchpad. I also don't find resizing windows to be overly difficult and I only rarely resize anything at all. (On Windows, I just use Aero Snap across four monitors, which works pretty nicely for what it is.)
But the applications within, and the general lack of user focus and attention to detail, are really the core of what keep me off of the Linux and BSD desktop.
> One of the somewhat frustrating assumptions running through your posts is that everyone materially gives a damn about the "control" of your window managers--as if the window manager is of more bracing importance than the applications running within. For you it may be, but to post as if this is a universal case is a little much.
Well, we're specifically discussing window management, not the software running within. A comparison of Linux and OS X software is a completely different conversation.
> I found it actively tiring and demanding of micromanagement to work with it.
This is a common complaint, and all I can say to you, in the spirit of Steve Jobs, is that you're "doing it wrong". If you're actively managing windows with a tiling window manager, then that defeats the whole point of using the software. The idea is to have predetermined rules for how your windows will be arranged, and then not rearrange them at all (or rarely).
The whole reason for Xmonad having its config file in Haskell, an actual programming language, and not just a flat file with some settings, is so that you are not limited in the logic by which your windows are managed. Granted, this does require you to learn Haskell, which is probably Xmonad's biggest practical weakness. A similar window manager written in a more popular and approachable language, such as Python, would probably have a lot more users while only sacrificing a few things.
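To make this concrete, here is a minimal xmonad.hs sketch (the class names and workspace assignments are hypothetical placeholders; the calls are the standard xmonad/xmonad-contrib API):

    -- Windows are placed by declarative ManageHook rules when they
    -- first appear, so you rarely rearrange anything by hand.
    import XMonad
    import XMonad.Hooks.ManageHelpers (isFullscreen, doFullFloat)

    main :: IO ()
    main = xmonad defaultConfig
        { terminal   = "xterm"
        , modMask    = mod4Mask  -- bind commands to the Super key
        , manageHook = myRules <+> manageHook defaultConfig
        }
      where
        -- Hypothetical rules: browser on workspace 1, chat on 9,
        -- fullscreen windows floated.
        myRules = composeAll
            [ className =? "Firefox" --> doShift "1"
            , className =? "Pidgin"  --> doShift "9"
            , isFullscreen           --> doFullFloat
            ]

Once rules like these are in place, window placement happens on its own; the only "management" left is switching between workspaces.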
> the general lack of user focus
That's quite ironic, given that this entire thread is the result of Apple ignoring user interests.
> Well, we're specifically discussing window management, not the software running within.
We are? I wasn't intending to. Throughout this thread I've attempted to be clear that I was talking about the desktop environment, rather than the window manager; I apologize if I slipped somewhere. Window managers aren't even really on my radar unless one sucks. Like, GNOME's WM--sure, whatever, it's there. I don't even really think about it. It's what I'm trying to do with it that drives me batshit. I don't even think about OS X's window manager except on the (fairly rare) occasion that I swipe up for Mission Control and throw a couple windows into another desktop. It's just not a thing to me.
The applications inside, on the other hand...welp.
> The idea is to have predetermined rules for how your windows will be arranged, and then not rearrange them at all (or rarely).
That's not how my brain works. I didn't say Xmonad was wrong, I said it doesn't map to how I think and work.
> That's quite ironic, given that this entire thread is the result of Apple ignoring user interests.
As far as I can tell, tiling window managers are the ultimate antithesis of micromanagement. What could you possibly be spending your time doing with it? It's like saying an automatic transmission makes you micromanage.
"This looks horrible. I'm going to move it to better show what I want to see and how I want it to look."
I will sacrifice some sheer productive efficiency to have an environment that looks and feels right. Tiling window managers don't get me there, which is why I found myself endlessly screwing with what it was doing with my desktop until I came to the conclusion that I could just not use it and be happier about it (which I did).
I guess you used something wildly different from what I use. "Moving things around to better show things" is exactly what you don't have to do with a tiling window manager.
> From my experience, OS X's UI is the last thing anyone would want if they weren't used to it.
I switched from dual-booting Linux/Windows XP to a MacBook with OS X after 5+ years. I much prefer OS X to either alternative. I still love Linux for server or embedded applications, but the Linux desktop experience continues to frustrate me. Unity has only made it worse.
> OS X's UI is the last thing anyone would want if they weren't used to it
I was very used to the MS world, living in it (DOS and Windows, from 3.11 to XP) from 1990 to 2006. I got my first iBook G4 in 2006 and never looked back at Windows. In 2006-2007 I was using OS X, Windows, and Linux in parallel. I know pretty well why I stuck with OS X, and it has nothing to do with being used to it.
> Worse, there are no operating systems other than OS X that fulfill my needs of a pleasant-to-use Unix/Linux system--certainly no Linux distribution comes close on the "pleasant-to-use" part
There is one. It's currently in beta, but it's a Linux distribution that wrote its own Human Interface Guidelines, wrote its own suite of default applications that conform to that HIG, and is firmly devoted to the desktop/laptop paradigm instead of the touchscreen/tablet up-and-comers. It's elementary. http://elementaryos.org/
I've looked at Elementary before and I applaud them for what they're doing. I like it a hell of a lot more than I like Unity, that's ferdamnsure. That said, it's not there and I tend to think it never will be for reasons that aren't their fault. Yeah, Elementary has written some appealing default applications, and the lessons they're putting down are ones I'm good with picking up. I like the way they're looking at applications like their calendaring app; they show some of what I have referred to in other comments as "give-a-shit". I appreciate that. The problem is that they aren't sufficiently weighty to change the rest of the stuff out there. They don't have the clout to tell the developers of applications like GIMP or Inkscape to conform to a sane HIG. So you're bouncing in and out of the land of decent applications all day. That sucks. It's not really their fault, to be sure, but it still sucks. Apple's control of their environment has done a lot to encourage/browbeat/force developers to act consistently with the platform, and that's a big reason that the ecosystem around OS X is so fantastic to work in.
Also, and this is just sort of a general thing, I don't think I can put up with a GTK+ environment again. It's fugly and gross and the spacing is always so wrong and...ew. Can't do it. Wouldn't be prudent. ;)
Unity may be a sub-par desktop environment at the moment, but at least they tried to do something new and innovative. Elementary just looks like an even more Apple-like version of an Apple product.
I have lots of CS friends who say the same thing. However, they never seem to be that aware of the options that are out there. Getting a PC doesn't mean you have to get a Dell or an HP; there are lots of options out there. Lenovo makes some great stuff, but so does Sager/Clevo, MSI, Asus, etc. For what Macs cost, you are not getting very good hardware (with the exception of the new screens).
While we are at it, why do people try to find devices that are as close to a Mac in appearance as they can get? Personally, I don't mind having a 3mm thicker laptop if it means I can have a more powerful graphics card, be able to work on it, or have multiple HDDs. Most people who really use their laptop for coding are sitting at a desk using it 90% of the time anyway. And for God's sake, am I sick of brushed aluminum. I find it ironic that a platform that so many artists use has the most boring, appliance-like look possible.
Also, what is the obsession with OS X? It's just a more limited version of Linux with a bad window manager. I have a Sager laptop, and multiple major Linux distros I have tried all work flawlessly out of the box, even with discrete graphics. I still use Windows 7 most of the time though, but that's just because it's the best OS available (at least for me).
Wow, that came off a lot more ranty and less reply-like than I wanted.
> And for God's sake, am I sick of brushed aluminum. I find it ironic that a platform that so many artists use has the most boring, appliance-like look possible.
That's not irony. That's misapplication of personal preferences.
> Also, what is the obsession with OS X? It's just a more limited version of Linux with a bad window manager.
With actual software support. And developers who show attention to detail in the presentation of their software. Some people, and I am among them, find that gratifying, and it makes me happy. Even if I could get Photoshop on a Linux machine (and, FWIW, if you suggest WINE you'll be demonstrating exactly what I mean), I would still gravitate to OS X. Because using it makes me happy and using even modern Linux distributions frustrates me.
>certainly no Linux distribution comes close on the "pleasant-to-use" part;
What have you tried recently? I spent an hour installing Ubuntu and running a script to get the power management tweaks on my MBA and it's far better than OS X was.
Since Leopard I've been fighting OS X more and more. It gets in my way, it makes things unnecessarily hard, I'm tired of having to fuck (sorry, but it's an accurate expression of how much time I've wasted) with homebrew and macports.
On the other hand, I installed Elementary OS Beta and tweaked the fonts and was perfectly happy with the out-of-box experience.
In the last six months I've tried Elementary (closest), Ubuntu (Unity and Kubuntu), Mint (MATE and Cinnamon), and Fedora. None stuck. And it's not really the distributions' fault, but rather the applications and all the little things. Ricardobeat put it nicely elsewhere in the thread: design-by-committee. There isn't a feeling of focus from one end to the other, and there's a sense of welcome extended to applications with crap UX that do something important. I don't like that, and I don't feel like that's common on OS X (even their weird Calendar skin doesn't get in the way of a good application).
It's an OS built by a lot of different people doing lots of parts in different ways. I'm not saying that I was told anything different, that's what it says right on the tin. But I find that doesn't make for a great user experience. (To Elementary's credit, activities where you can stay within the Elementary suite of applications are really pretty nice! But that makes being thrown outside at the likes of GIMP even more jarring.)
You should take a look at Xfce, by far the most consistent and not-in-your-way DE among the popular ones since Gnome2. And with some themes that actually look good (like Greybird, default in Xubuntu).
> Ricardobeat put it nicely elsewhere in the thread: design-by-committee.
I feel there is a common misunderstanding of what this term actually means. I'm a Linux user and I don't like most DEs and most themes, but I wouldn't imply there is no leadership among their creators or that the decision process is completely flawed. Actually, I think they know exactly what they are doing; the fact that you and I don't like it is another thing.
I gave them as much of a chance as they deserved. I already own a Mac with an OS I very much like. If you don't impress immediately, you're not going to get traction.
And it's not like I'm new to any of them. I ran Linux nearly exclusively from 2005 to 2009.
> There isn't a feeling of focus from one end to the other
Like what? I hear this repeatedly, but it comes with a healthy dose of handwaving and a lack of specifics. I guess I just don't see it. I use VLC, Gnome-terminal, Sublime Text, Pantheon-Files, Chrome, Firefox, and that's basically it. They all act the same as I'd expect in any desktop environment, and I've literally never had a meta-moment of worry or thought about something being "off".
Gimp is awkward on every platform, what does that have to do with Linux DEs?
Off the top of my head, talking about the Ubuntu 12.04.1 I use daily on two different Thinkpads:
I have to log in and out or reboot pretty much every day to solve X or window manager problems. There are a variety of focus and window ordering issues. Coming back from sleep, each one has different display twitches; nothing worse than the usual Linux user contempt, but definitely unpolished. The Unity stuff is going in a good direction, but I don't think they've been doing a ton of user testing; it also frequently feels clunky to me. Network Manager is slow to find networks and regularly crashes. Device support is so-so: the Android tethering that is supposed to magically work never does; plugging in a phone or a tablet frequently requires command-line incantations and is often buggy. One of my devices has a touch-screen; a minor kernel upgrade broke this a couple months back, and it is still broken.
I think they're right to go for a better UI experience, but right now I think they're in a valley between the solid but ugly Linux experience I was used to and the solid but friendly experience that Apple has inspired them to pursue. Right now I think it's flaky both in a usability sense and in a technical sense, which I find thoroughly disappointing.
>I have to log in and out or reboot pretty much every day to solve X or window manager problems. There are a variety of focus and window ordering issues.
These aren't specific. I honestly don't know what you're talking about here. I've logged out of X to solve a problem once - that was checking "Use the NVIDIA driver" and then rebooting. The only time I've ever rebooted to "fix" an "X problem". That having been said, I won't defend compiz for a second. That was a mistake and they're sinking more and more money into it to my disappointment.
>Device support is so-so: the Android tethering that is supposed to magically work never does
My Galaxy Nexus, Droid 1, Nexus 4 and Nexus 7 3g - I tap USB tether and about 5 seconds later Ubuntu tells me it's connected. (Unless this has regressed in 12.10, I haven't had reason to use it lately.)
>Network Manager is slow to find networks and regularly crashes.
You can get NM to crash? I hope you've filed a bug report because that is a major issue.
As long as we're talking about things that make me crazy about Linux, here's another fine example.
You asked a question. I took some time to give you an answer. Did you thank me? Did you acknowledge the information as relevant? Did you ask follow-up questions to better understand the situation? No, no, and no.
Instead, you seize upon the parts you can most easily argue with. You use scare quotes to imply that I must be an idiot. And you tell me what work you think I should be doing.
It's not my job to educate you. It's not my job to convince you of things you don't want to believe. It's not my job to file the particular bug reports you think best.
Your behavior here exemplifies the arrogant, self-centered, argumentative idiocy that has repeatedly driven me away from participating in open-source projects. I've got better things to do than to fight with people on the internet. Especially anonymous ones.
X is pretty weird sometimes: you switch displays, or suddenly connect a third display, and the only way to fix it is to restart X. Sometimes it works, and sometimes it doesn't. I honestly don't know X well enough to debug the problem, so restarting X is just the faster solution. I'm guessing you probably don't use X on the road as much; I feel like this is a normal thing for me -- restart X to solve a problem with a weird display.
Ah, if "on the road" means plugging and unplugging external monitors, you've caught me in my "oh I should have thought of that" spot.
I haven't had to do that in a long while. I have a permanent desktop machine, so my laptop is only for coffee shops or working at a friend's place. I can certainly understand how hotplugging can be a disaster. Thankfully, I have noticed an improvement there with Nvidia's more recent driver releases, possibly as recent as the ones resulting from their work on Steam.
Anyway, I've been running Wayland builds on and off for a month now, let's hope that paves the way for minimizing these problems even further.
You never have to plug in your laptop to do presentations or anything? I feel like I do that all the time... plus you never know what type of LCD projector they are using. Sometimes a guy comes and tries to help you, but when they notice you're running Linux they sort of back off.
I'm not that crazy of a dev, and there were at least 4-5 things that straight up wouldn't build in homebrew or lacked scripts. Basic GNU tools shipping with Mac OS X are vastly out of date (and in some cases buggy) due to GPL issues. I finally gave up after an afternoon of fighting to get GDB signed so I could debug my Golang app, and wound up installing a VM, adding a new SSH key to GitHub, setting up my dev environment, and debugging it in Linux in less time.
I used OS X because I had an MBA. When classes were over, I decided to see how much work Linux would be on it, given that Mac hardware and Linux don't always get along. Basically everything worked out of the box though, minus a powersave script I ran.
Well, let's just say a lot of things don't work in Linux either. Depending on what you're building, sometimes it's reliant on a certain gcc version or distro. Generally speaking, compiling from source is usually the last resort if the package can't be found via apt-get, and occasionally you'll have to tweak the makefile or the ./configure settings. Any time you're building something that isn't extremely popular, you're likely to get build problems.
So far, the ratio of build issues I've run into in OS X to the build issues from version mismatching in Linux is approaching infinity.
egrep, gdb, and the few other things I fought last month in brew were not uncommon things. They were all things that were preinstalled in Ubuntu. Furthermore, I haven't had to build anything from source other than the projects I collab on in years.
I don't know why we're quibbling about this; it's generally accepted among Mac devs... it's a small price to pay if you like the OS X user experience. I haven't for some time now, so the lacking developer tools and the competition from better Linux DEs have tipped that scale. Trust me, I still love my MBA.
You have to be joking; Microsoft is way more hacker-friendly than Apple. Try to build your own OS X PC and you can't, because Apple won't sell you a license. Apple started the whole walled-garden game with iOS.
Well, yes. Microsoft is a software company. They're happy to sell you the software to run on any device. Apple is a hardware company. If you want to run their software on a non-Apple piece of hardware, you're not their customer. In fact, in the early '90s, when Apple did allow Mac clones to be built, that almost killed the company.
No, those who complain about Apple being a control freak are control freaks themselves, because they want to control everything. The problem is, a lot of users are just happy not having to control; they want things to work, not to spend time tweaking.
Wanting to control something I own doesn't make me a control freak; wanting to control someone else's stuff does.
And even if those people were hypocrites, that wouldn't make them wrong. It doesn't matter if some of Apple's customers are control freaks. That doesn't magically change whether Apple is a control freak or whether being a control freak is bad. Your counterargument here is basically the word 'no' followed by an irrelevant insult.
It's because they make very high-quality products. I changed over after being sick of my Linux laptop never connecting to wifi and the full-disk encryption on Ubuntu never working. It took some getting used to, but Macs are really great products that are great to work with and write code on.
I don't think it's just aesthetics; they ask a lot of money for a new charger. And because they conveniently chose to use a patented charger thingy nobody else is allowed to make, you're going to pay them that bundle of money.
I'm convinced Apple designs their products to break quickly, especially components that are not covered by warranty (e.g. cables, chargers, earphones, etc). Those inexpensive components render the whole thing pretty much unusable, so you're forced to buy a replacement. It pisses me off.
Yet, if I were to buy a new laptop, I'd probably buy a MacBook. They're just so damn nice to use.
A big thing for me is the MacBook trackpads (and the standalone Magic Trackpads). They're just so awesome to use, and no other manufacturer's come close to Apple's.
In the last year I've gotten practically addicted to my Magic Trackpad. I've been using gestures on my MBP for a few years, but when I got the trackpad for my iMac at work last year and started using it full time... wow.
If I want to play a game, I'll use a mouse. Same with drawing (or I'd get a tablet). But for my daily work of programming, surfing, email, etc. the trackpad works great.
But the thing that has me addicted is the gestures. Three-finger swipe left or right to go forward/back in Safari, IntelliJ, Mail, and everywhere else. Four fingers up to trigger Exposé and show all my windows. Spread all fingers to show the desktop. Pinch-to-zoom is one I don't use much, but when you want to look at a larger version of a screenshot on a web page it comes in handy.
It seems like such a little thing, but I'm lost without the back/forward gestures. They are second nature to me now, and when I use someone else's computer I'm constantly frustrated by not being able to do it.
I'm sure there are gestures in other OSes at this point. Two-finger scrolling was an instant hit with me. After trying it, the PC laptop solution (special scroll areas on the trackpad) just seemed so pathetic. Two finger right-click was great. I'm still consistently surprised to see relatively new PCs with small trackpads. My boss has a 17" Alienware he bought last year, and it has a much smaller trackpad than the 5" pad on my 15" MBP.
People forget what a company that is friendly to hackers is like.. if indeed they ever knew in the first place. They see "Oh, it's a UNIX?" and think that is the end of the story, the best you could possibly ask for.
I don't think anyone thinks that. I think that for many developers it's a balance between a solid platform to write code for and a user interface that appeals to them. I flat-out won't go back to a Linux desktop environment until one demonstrates as much basic give-a-shit for building a great computer experience as Apple does. That's as important to me as being "friendly to hackers" because I have to live in front of it every single day.
If you don't claim to have chosen Apple because the company holds good hacker ideals, then... okay? Good for you?
My concern is with those that have. You think those people don't exist; if not, then my concern is with nobody. I suspect that is not the case. In fact, I know it is not the case, with at least a small subset of Apple users I have met personally. To be fair these particular people are not affording Apple any undeserved 'hacker cred' for having a UNIX operating system, but rather giving them the 'hacker cred' for the legacy of Woz. Woz is great, but that is irrelevant.
Weird; maybe everyone you know is old. I don't think I know anyone who cares about Apple because of Woz. Many don't even know who the heck Woz is anyway. Apple has hacker cred because they do have lots of engineers working on products that we use, like llvm-gcc or the WebKit rendering engine, etc.
I am not saying that everybody I know who uses Apple products thinks this way. They are a minority.
> at least a small subset of Apple users I have met personally
The people who give Apple undeserved 'hacker cred' (a minority among Apple users) all seem to provide justifications along the lines of pointing out how Apple used to have deserved 'hacker cred'. Invariably they are Woz fans (as am I, lest I give off the wrong impression).
No, it's not as bad as Microsoft. The software is actually reasonably fine. It has its warts, especially where Apple gave free rein to their incessant mania for reinventing the wheel, but OS X provides a reasonably comfortable environment. If you stay out of the prison-like iOS ecosystem, Apple's software solutions are reasonably hacker-friendly, I think.
The hardware is not that bad functionally either; in many aspects it's better than the competition, but there are many more things that suck. The primary one being connections to the outside world. There's still no proper docking solution for Macs, which is just shameful. Power connectors universally suck across all recent Apple hardware: MacBook power sucks, iPod power sucks, iPhone power sucks. One can't help but think it's done deliberately. And on top of that, they don't even agree to be charged from just any USB port; some just don't work, no idea why. I'd say "worse than Microsoft", but as Microsoft doesn't make any comparable hardware, Apple is left in a class of its own, without a peer to compare it to.
Not for me. I want a nice, reliable machine with Unix underneath and a sane GUI on top so I can just open it and hack away. My Mac gives me exactly that. Long gone are the days when I would watch the Linux boot screen flying by and feel like a hacker because of it.
You don't have an emotional bond with good, well-crafted tools? They don't make you happy to be using them?
I do. It's why I consciously pay a premium for a Mac. It's why I pay for Photoshop. It's why I pay money for IntelliJ when Eclipse is free. They're good tools that make me better at doing things I want to do and they make me happier because I'm using them.
ThinkPad fans are probably the only other group that likes a product line based on its hardware. They may be a smaller and/or less vocal group, but they're out there.
For me using tools is the only sacrament. It's tinkering and improving and learning from them that makes me human, besides keeping me from starving in a cave somewhere. Contempt for tools would have been an unimaginable luxury just a few generations ago, and I'm disturbed to see it steadily emerging here.
My MacBook is nearing 3 years of age. The 'weaker' rubber sleeve on the charger is still good; same goes for four other Apple mobile chargers and headphones. Meanwhile, another laptop I have has basically crumbled into pieces after 4 years. The screen is blotched, burnt, uneven; the keyboard is failing. It heats up like a toaster. The power brick had to be replaced a couple years ago.
I completely agree. You cannot put your own software on it, and if you have one you're forced to take an extra set of cables when you go on vacation. And you need to baby those cables, because adding protective sleeves to the ends "isn't the right design", but a new one is priced like it's made from gold.
I'm asking them the same question. I backed this as an android user looking to charge my power hungry devices. The iPhone compatibility was secondary for sure. I was willing to be patient while they figured out the lightning plug thing and am extremely disappointed that they're cancelling (and siphoning all the money through their startup) instead of releasing a product.
Siphoning implies they are doing something untoward with it, which does not appear to be the case here. They're already building something that will let them do a mass refund; they have decided to use that tool to send out this money, at some cost to themselves. What could be wrong with that?
Exactly. I know the founder. He originally had no plans of creating an alternative to Kickstarter. He wanted to create great hardware products and use Kickstarter to fund the development. After running into Apple approval issues for POP and seeing how a less-than-honest company could ship a shitty product and keep the $140k that backers had provided, he realized just how bad Kickstarter is for funding and creating hardware products. Not only is he giving the money back - this inspired him to create a better alternative for creating and backing products. It is a classic founding story.
I'm a bit skeptical as well. The alternatives to giving up seem too obvious. Perhaps they didn't crunch the numbers right and were going to take a bath on building them. Assuming it's really just a problem they had with Apple, I don't see why they didn't run a poll asking their customers what their preference was (refund or design change). If nothing else, just bundle a Lightning cable with the device.
Having a plain USB port or two would be handy anyways as there are still plenty of proprietary USB cables in the wild (GPSes, oddball electronics, older electronics, iPod shuffles).
They seem to have 2 female USB connectors already. They could have just changed the 2 30-pin connectors to USB as well. Or have 4 USB + 2 Lightning.
Not sure if they're also supporting the extended dock connector functionality, and hence had to keep the 30-pin connector... but I agree that it seems like a press grab.
> Apple has refused to give the project permission to license the Lightning charger in a device that includes multiple charging options.
> In fact, even combining Apple's new Lightning connector with the old 30-pin connector in a charging device was verboten
Emphasis my own. That doesn't read like combining Lightning with the 30-pin was the only issue, but rather the particularly egregious-seeming one. I'm reading it as Apple not allowing Lightning plus USB either.
Yeah, if it were my product I would have refunded everybody's money and immediately launched another campaign for the all-USB version of the product.
Announce the new campaign in the same email announcing the refund; if people were still interested, they could fund the new campaign.
Maybe they do plan on re-launching eventually, or maybe they just don't think there is demand for the compromised product. Or maybe they ran into other issues and are using this as a graceful out. Who knows.
Part of their pitch is that you don't have to carry your cables around (they did include 2 USB ports at the base just in case).
But I guess their mistake is pitching this as an iDevice charger from the start (see their video; very heavy focus on iDevices). And when they couldn't include a lightning connector, it became a deal breaker.
If you buy a phone you get a USB to Lightning cable. You can buy more of these if you want. Why is this a deal breaker?
There are a lot of devices out there that aren't Lightning and would benefit from a system like this, so 4 regular USB ports is a lot more useful than specialized connectors.
I agree. This seems like a problem with a trivial solution. My bet is that they decided they wouldn't make enough money on the project to make it really worth their time. Blaming Apple may be justified, since it is a stupid policy, but I'd just as soon have a system with four USB ports that I can plug whatever I like into.
I thought you couldn't charge Apple devices from just any USB port? I have an Apple tablet at my work and my PC does not charge it; I have to use my friend's Mac.
Per spec, USB 2.0 outputs a max of 500 mA per port. This is enough to charge an iPad when it's asleep, but as soon as the screen is on it will stop charging in order to power itself.
The wall charger and Macs include a non-standard trick (a resistor on the data pins in the wall charger, probably some active negotiation on Macs) to output up to two full amps.
Higher charge rates are part of the USB3 spec, but this would require both devices to implement it, and I don't know if it's even a mandatory part of the spec.
So, plug your iPad into a PC and it will charge, just slower, and only when sleeping.
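To put rough numbers on that (a back-of-the-envelope sketch; the 42.5 Wh battery figure is an assumption, roughly a 3rd-generation iPad, and conversion losses are ignored):

    -- Compare what a plain USB2 port and a 2 A wall charger can deliver,
    -- and how long a full charge would take at each rate.
    busVolts :: Double
    busVolts = 5.0

    watts :: Double -> Double
    watts amps = busVolts * amps

    hoursToCharge :: Double -> Double
    hoursToCharge amps = 42.5 / watts amps  -- assumed 42.5 Wh battery

    main :: IO ()
    main = do
        putStrLn $ "USB2 port (0.5 A): " ++ show (watts 0.5) ++ " W"
        putStrLn $ "Wall charger (2.0 A): " ++ show (watts 2.0) ++ " W"
        putStrLn $ "Full charge at 0.5 A: ~" ++ show (hoursToCharge 0.5) ++ " h"
        putStrLn $ "Full charge at 2.0 A: ~" ++ show (hoursToCharge 2.0) ++ " h"

At 0.5 A you get only 2.5 W; a sleeping iPad draws less than that, so it trickle-charges, but with the screen on the device draws more than the port can supply.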
You'll probably find that it does trickle charge (though doesn't own up to it), but just barely.
Your PC likely isn't delivering enough current via USB to charge an iPad. (I think it needs a 1A supply and the USB spec only requires 500mA or something like that).
Apple refuses to license any product which combines the Lightning connector with any older Apple connector.
Thus, buyers would have had to use the 30-pin-to-USB cable (or the USB-to-Lightning connector, which is apparently not available in the US?). Either way, this was not the product that they Kickstarted. They decided to do the right thing by returning the Kickstarter money. The company behind the Kickstarter is contemplating whether to have a new Kickstarter that would feature Apple connectivity by way of the USB-to-Apple connectors (rather than through dedicated connectors like they have/had in the current version).
I think their mistake was Kickstarting something for "Apple iPhones, Apple iPads, Apple iPods, other Apple devices, (and oh yeah some of those other things that make up the majority of the world's rechargable devices too)".
It's just plain unwise to make business promises to deliver interoperability with proprietary interfaces without having your agreements in place first.
As a side note, I'm quite disappointed that we as a technological society can no longer implement basic means of supplying low-voltage DC power for a users' own devices.
Except it wasn't a mistake. Just because it failed doesn't mean it was a mistake.
We should be admiring them for taking risks like these. If it worked, it would have been quite awesome. And now that it's failed, everyone received their money back, so no harm was done whatsoever.
Yes, it was their choice to return the money rather than pivot the project, but that's irrelevant -- the fact is that they returned all the money to the backers and chose to absorb the $11k loss themselves. This is behavior we should be encouraging, rather than trying to discourage people from daring to dream and to take harmless risks.
But I just hate to see anyone having to eat credit card fees on $139,170 for something that was preventable.
Power connectors have historically been low-risk parts so it's an understandable mistake. But sourcing and costing all the needed custom parts should have been one of the first steps in the hardware design process.
> But I just hate to see anyone having to eat credit card fees on $139,170 for something that was preventable.
It was their risk to take! Why do you care?
> Power connectors have historically been low-risk parts so it's an understandable mistake. But sourcing and costing all the needed custom parts should have been one of the first steps in the hardware design process.
So what if they didn't? It's their decision.
I'm struggling to understand what your point is. "Here's why [I think] they failed." Well, yes, in hindsight, mistakes are obvious. But in the moment, things aren't so clear-cut.
It would have taken a single phone call to Apple's licensing group to learn that this would be a problem. They didn't make that phone call. That was a mistake.
> It would have taken a single phone call to Apple's licensing group to learn that this would be a problem.
That's not true. Apple saying "no" when there is no money on the table and no popular interest means nothing.
Apple saying "no" when you have customers beating down your door and $130k to divide between licensing and manufacturing is a different matter. Apple chose to pass up this money. There was no way to foresee this until the choice was on the table.
Building first and asking permission later is sometimes admirable.
But one ought to weigh in advance whether or not 1000 customers and funds on the order of $130k is likely to provide enough leverage to bend the rules of a company with, literally, the most money in the whole world and a famously proprietary attitude.
According to the article, it sounds like they started on the design before Apple announced the new connector, and then updated their design once the connector was announced. Also, it seems they did contact Apple but "[they] didn't get a yes or a no up front." Once they finally got a straight answer from Apple, they canceled the product. So apparently they did make that phone call, and it wasn't as simple as you seem to want to make it.
Before they did the kickstarter, they were fully in the right by Apple's own guidelines and policies. Even now, they're in compliance with the public ones, just not the 'just about to be released even more anti-competitive ones'.
You get a usb-to-lightning cable with any new iOS device, that's what you would use with it. By just providing a female usb port, bring-your-own-cables, they wouldn't need licensing from Apple at all.
Just want to add some points to this thread as I am the inventor of POP.
For those who question our motives, your reasoning is fair but let me clarify some things.
We started Christie Street because of our experiences with POP. We have been getting closer and closer to this decision since Apple announced Lightning and began its new rules on the adapter. While we were going back and forth and waiting to see if we could still build POP, we realized that we did not have a good way of refunding our customers. That is why on Christie Street we built an automated refund system for our inventors. So when we finally realized that POP just could not be made to what we promised, we wanted to use our platform to do the refunds, as it would be easier for us to manage. It would also allow us to test and tweak the system with real customers. We are all trying to build better companies/products, and looking at a loss on POP, this seemed like a good way to at least salvage something out of it while delivering an above-average experience to our customers. I would call that a win-win.
Lots of people also seem to want POP even without Lightning and are upset that we did not poll our customers about a USB-only version. But they are missing the point: we said in our campaign that we would support the iPhone 5, and at the time we had no reason to believe that would be impossible. Why would we have thought that they would sell the new adapter any differently than the old 30-pin?
We are not willing to compromise on the product or deliver something that was not as promised. If that brings skeptics, so be it, but at the end of the day I know that we are doing the right thing. Not the easy thing, but the right thing.
> Why would we have thought that they would sell the new adapter any differently than the old 30-pin?
Before the new connector was announced there were rumours that licensing for it would be more restrictive. Also notice how you can't license MagSafe at all.
I'm not saying Apple's decision is the right one, but IMHO it was clear there was a risk regarding licensing.
This would've been a better product if it just had a general-purpose retraction mechanism and let you bring your own cables. Apple may have been jerks, but they could've designed around it and made a better product instead of just throwing up their hands.
Wasn't there an EU ruling a few years ago saying that cell phones needed to start using micro-USB connectors? How is Apple getting around that and still selling phones in the EU with proprietary plugs?
You are misinformed. There may or may not be DRM according to the teardowns. There is certainly a bit of silicon which has a minimal DRM capability, but it may have come along for the ride. If there is DRM, it is not enforced. There are knockoff connectors out there.
The connector electronics are there to allow higher current charging by allocating more pins to power transfer when they are not needed for data and to allow as yet undeveloped higher speed protocols to operate on the same connector.
If there is DRM I can see Apple's point. When you destroy an iPhone with a cheap charger that fails, it costs Apple money to replace it. If you attempt to reverse engineer the connector, you will be wrong. The future capabilities are not present for observation. You will probably create a device that behaves improperly for future protocols. Will you destroy that future device by pulling -5V on a low voltage differential data line?
I really fail to see how opening up the options for which plugs are supported--by just offering a USB Type-A adapter and providing some cables, or having the user supply their own--is "compromising" their product, unless it makes them too similar to products that are already on the market and have been for a while. I think this is more a failure of execution than it is Apple "killing" the project.
It is an issue of not being able to provide the product that people have given their money for.
If I were in their position, and my priority was operating ethically rather than successfully, I would refund all the money as they have done, then start again, pitching a USB-only product.
I'm not quite convinced by the argument that "Apple killed" it. IMHO it's always dangerous to go for one plug design, given how those plugs have evolved over time (think USB, micro-USB, mini-USB for Android, etc.). I quickly amassed a plethora of unusable chargers in my drawer; only the Apple port had a long lifetime across multiple devices. I would say the Kickstarter team could use this as an opportunity to redesign an initial flaw of an otherwise nice charging station.
They can't. The point of it was to be a charging station that could charge everything. Apple's updated more-anti-competitive license changes make this impossible by forbidding anything from having a lightning connector in conjunction with any other connector. It's basically Apple forcing 3rd parties to make docks, speakers, etc that ONLY work with new Apple products and won't support anything else.
They could have simply provided a standard USB-A port. Trying to put all possible connectors on the thing sounds like a fool's errand, regardless of Apple's stance.
If you look on their kickstarter page, they actually did provide USB-A ports on the bottom of the unit. They just weren't able to 'compromise their product', which makes me believe that perhaps they had other underlying issues, unrelated to Apple's licensing.
There is already a glut of portable chargers on the market that provide a standard USB-A port. The point of this device was to stand out by providing the Lightning (and other) connectors without needing additional cables.
Wait, why does Apple even have the power to do this...?
It's a connector... Why on earth is it possible to patent a connector that doesn't involve any particular innovation (take wires, rearrange a bit, tweak shell shape) in the first place...?!?
Well, it's not really DRM -- the chip in there is used for lots of things. There was an article about the Thunderbolt cable (which has a similar design to Lightning); it has a chip inside the cable as well. Note that Thunderbolt was actually designed by Intel.
I think lots of future cables will have these chips in them. In Thunderbolt, the chip multiplexes & demultiplexes the data. Perhaps the idea is that speed increases can be had just by upgrading your cable instead of upgrading the port.
In the Lightning cable, the chip is also responsible for determining the orientation of your connection, since the connector is reversible.
The pins are on the outside of the connector. It's trivial to short-circuit them by accident, which could easily cause a fire if there were not a chip in there to control when you "shove 5V over the wire".
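If you want to picture the chip's two jobs -- working out which way the plug went in, and gating the 5V -- here's a speculative sketch. The pin names and the handshake are made up; only the general idea comes from the teardowns:

    # Speculative sketch: pin names and the handshake are invented.
    def detect_orientation(id_pin):
        """The ID signal lands on a different physical pin depending on
        which way the reversible plug was inserted."""
        return "normal" if id_pin == "P4" else "flipped"

    def pin_map(orientation):
        """Remap logical functions onto physical pins for this orientation."""
        functions = ["gnd", "data+", "data-", "id"]
        pins = ["P1", "P2", "P3", "P4"]
        if orientation == "flipped":
            pins = list(reversed(pins))
        return dict(zip(pins, functions))

    def vbus_enabled(handshake_ok):
        """Keep the exposed pins dead until a valid handshake completes,
        so a coin or keyring shorting the plug never sees live 5V."""
        return handshake_ok

    print(pin_map(detect_orientation("P1")))  # flipped mapping
    print(vbus_enabled(False))                # no handshake, no power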
Doesn't DRM stand for digital rights management? I think you may have confused it with something else. DRM doesn't prevent people from manufacturing cables -- it's meant to prevent the end user from accessing content they don't own. DRM was never aimed at reducing the number of manufacturers out there; if anything, DRM vendors wanted to increase the number of licensed manufacturers so that there would be less hardware around that could play media you didn't buy (pirated content).
DRM prevents digital data/signals from going where an end-user wants them to. So you cannot play DRMed media on an open-source player, only on players blessed by the DRM vendor. The end-user has his choices artificially limited.
And now the end-user cannot use any cables he likes. Only those cables blessed by the proprietary connector vendor. You have authentication where none is needed, exclusively to give the vendor power, not to provide the end-user with benefits.
Both are about digital data/signals and having artificial restrictions imposed on them. I think the similarities are good enough to warrant the name DRM.
To the layman it may appear that way, but I assure you the Light Peak cables don't have any DRM logic in them; they merely mux/demux signals. I think you're just a bit misinformed, or you consider any chip in a basic thing like a cable to be a DRM device. I suggest you look up DRM on Wikipedia to brush up on your definitions.
If there are more adopters of Light Peak, there will be other cable manufacturers, and users can buy any cable they want. People who don't understand electronics also wouldn't understand how this might benefit the user.
Personally, I think Light Peak is a fairly interesting way of implementing something; it enables ports to be a lot more capable without having to replace electronics on the motherboard.
Perhaps users who don't fully understand what is going on inside would prefer to think that it's something evil...
I haven't seen the connector myself, but I would not expect it to be devoid of all innovation. But I do find it somewhat disturbing that after 100+ years we're still finding patentable modifications to ordinary low-voltage plugs.
Someone should make a Kickstarter for a compatible charging connector that unambiguously avoids all the patents. Something dirt-simple like just a few bare metal pins sticking out providing voltage and alignment.
You can't. Apple built DRM into their cable with a proprietary security chip. All DRMed and patented up nicely to lock everyone out of just about everything.
From your link, the discussion doesn't seem to have a definitive conclusion:
> The folks at Chipworks has done a more professional teardown, revealing that the connector contains, as expected, a couple of power-switching/regulating chips, as well as a previously unknown TI BQ2025 chip, which appears to contain a small amount of EPROM and implements some additional logic, power-switching, and TI’s SDQ serial signalling interface. SDQ also uses CRC checking on the message packets, so a CRC generator would be on the chip. Somewhat confusingly, Chipworks refer to CRC as a “security feature”, perhaps trying to tie into the authentication angle, but of course any serial protocol has some sort of CRC checking just to discard packets corrupted by noise.
So until someone finds a truly dumb Lightning charging cable, the question as to whether or not DRM prevents it is alive.
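For anyone who hasn't run into one: a CRC is just a checksum the receiver recomputes to throw away packets mangled by noise -- nothing security-related about it. A generic bitwise CRC-8 looks like this (illustrative polynomial; I'm not claiming it's the exact one SDQ uses):

    def crc8(data, poly=0x31, init=0x00):
        """Bitwise CRC-8, MSB first (polynomial x^8 + x^5 + x^4 + 1)."""
        crc = init
        for byte in data:
            crc ^= byte
            for _ in range(8):
                crc = ((crc << 1) ^ poly) & 0xFF if crc & 0x80 else (crc << 1) & 0xFF
        return crc

    packet = bytes([0x01, 0x2A, 0xFF])
    wire = packet + bytes([crc8(packet)])     # sender appends the CRC

    received = wire                           # pretend this crossed the wire
    ok = crc8(received[:-1]) == received[-1]  # receiver recomputes and compares
    print("packet accepted" if ok else "packet dropped")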
Regardless of this particular issue, can anybody explain Apple's stance on this? Why are they forbidding Lightning connectors on products alongside other types of connectors?
Although the simple explanation is "Apple is greedy and wants to milk their users with overpriced charging cords" I think the real explanation is more complex.
Apple wants to control every aspect of how its users experience the products that they think of as "Apple". When Apple sells a charger and charging cord, they want it to be a charging experience worthy of the Apple name. I know it sounds ridiculous, but this is how they think.
Every consumer eventually ends up with a big tangle of chargers and adapters. The kind of thing where you unplug it and tug on the cord and the FCC-mandated RF balun is a big lump and the cord gets stuck and you have to crawl under your desk to retrieve it. Most of us live with that kind of thing every day and most normal people hate it.
In short, Apple wants its users to experience its products as a refreshingly simple and clean salvation from modern electronic hell. The existence of a single cable that mixes the new connector with the old connector is seen to tarnish that user experience.
Everything other than iPhones almost universally charges via MicroUSB now. It's hard to end up with a "big tangle of chargers and adapters" when all you need is a MicroUSB cable.
I'm not defending Apple's position here, just trying to imagine how they think.
I love MicroUSB. (In fact, I got a tip that Santa Claus will be bringing everyone in my family MicroUSB cables for Christmas! Shh!) But somehow I still have a big tangle of cables (USB and others) on my desk. Admittedly, none of them are Apple.
> Although the simple explanation is "Apple is greedy and wants to milk their users with overpriced charging cords" I think the real explanation is more complex.
Occam's razor says that everything you wrote after this paragraph is wrong.
To stop people damaging their phone. I bought one of those iPhone 5 docks off eBay and threw it away immediately after it started to scratch the pins on the phone.
I'm speculating here, but aside from the Christie Street motives people are throwing around, it might also have been prohibitively expensive to get a 26,000mAh battery into the device at that price point.
Sucks for them, but can't they make the thing a USB hub? BYOC. Why do I need their cable? USB-A ports are more practical. If I have an Android, why would I want iPhone chargers?
Maybe you have friends, family, or loved ones with iPhones who visit your home and want to charge their phones? No? Then this product isn't for you. I personally am in love with a woman who comes from a family of Mac users; perhaps you can understand my dismay when I stayed at her brother's house for two days and he didn't have a single MicroUSB cable -- and he's a tech geek/developer with 3 computers (MBP, Mac Pro, and MacBook Air), 2 tablets (iPad 1 & 2), and 2 (i)Phones. I never expected that someone could have so many tech devices and not have a billion USB cables like most of my tech friends do.
Then there's the fact that I'm competing with my old lady for outlets for our phone chargers. We live in an extremely old house, and the number of outlets is limited, as is how far they can be extended.
It's not a bad idea, I can see how it would be very useful for some people.
The thing I'm having a problem with is that they must have seen this coming. Everyone knows Apple is a control freak. This was entirely foreseeable, and getting the licensing requirements from Apple should have been basic due diligence. At a minimum, they should have planned for this possibility from the get-go.
I found this quote from the project's letter about the situation impressive:
"Providing full refunds means we will have to absorb a hit for both credit card (3%) and Kickstarter fees (5%) totaling over $11,000. Today we asked Kickstarter for the 5% fee they collected based on the circumstances, however regardless of their decision YOU WILL RECEIVE 100% OF YOUR MONEY BACK."
I'm sure plenty of people wouldn't have had a problem if they'd passed along the various fees. After all, Kickstarters are still speculative projects that can fail even after funding. Eating whatever fees they can't get refunded is seriously going above and beyond to make things right for their backers in a difficult situation.
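For scale, the quoted percentages let you back out roughly how much money is moving here (back-of-the-envelope, assuming both fees apply to the full pledge total):

    card_fee, ks_fee = 0.03, 0.05       # rates quoted in the letter
    fees_absorbed = 11000               # "over $11,000"
    implied_pledges = fees_absorbed / (card_fee + ks_fee)
    print("implied pledge total: ~$%.0f" % implied_pledges)  # ~$137,500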
What does it even mean for Apple to license this connector? Do they have a patent on it? It's just a fucking plug. Or are they just refusing to let these folks use some trademark to mark their product?
> What does it even mean for Apple to license this connector?
Apple built DRM into it. You cannot work with it without Apple giving you the keys first.
Yes. Apple managed to sell the iFanboys a proprietary, DRMed USB connector and make them believe they were getting something better than everyone else.
Whether that makes Apple geniuses or Apple's customers suckers is your call.
I think most of the Lightning cables actually have a chip inside, like Thunderbolt. It may look like a simple cable, but there's some electronic stuff inside.
Apple has a program called MFi; usually you're supposed to get an MFi license if you make hardware for the iPod/iPhone family. Since IDAPT doesn't list MFi anywhere, I'm guessing they don't license it and just aren't popular enough for Apple to sue yet. You don't have to ask Apple if you're going to violate the license anyway... as long as you're small, Apple probably won't care. I think these guys tried to do the "right" thing.
I'm sure there are a few dozen Chinese companies that violate all kinds of licenses that nobody really cares about. There's a Chinese company that makes an iPhone knockoff that is actually called "iPhone" -- I saw one on the streets of Hong Kong. How do you think they do it?
1. Apple rejected the cable. To those who say "just go USB": it was supposed to be a seamless product, and that destroys the aesthetics. Image is everything in product design.
2. In parallel, they were working on Christie Street to solve the issue of non-refunds on Kickstarter.
3. So why not turn something bad into something good? Get some press for their new site, and also issue refunds for everyone. How else would they issue refunds? And how does anyone lose in this situation?
If anything, this seems like a great and fair way to deal with the issues at hand.
Why not just continue the project with a few other connector types, and separately make an adapter for Apple's connector (authorized or otherwise), or recommend an existing something-to-Apple-connector cable?
So I'm looking at my Monoprice external battery pack. It says specifically that it works with iPod, iPhone, and iPad. Is the problem that they built the cables into the device instead of just outputting to a USB hub with specifically defined output rates?
I applaud Jamie and the POP team for going for it -- everyone who develops on a proprietary platform like Apple's (apps, hardware, whatever) faces this risk. It's a risk you have to take if you want a shot at success.
Nothing like calling Apple a bunch of a-holes in the press to make you look super professional. Siminoff sounds like a whiny baby who got mad and flipped the Risk board. At least all the backers are getting their money back.
I don't blame him for dealing with his grief. He has a right to be completely "pissed", and he has to be very emotional because he cared so much about this project, he worked so hard... and he probably had some passion for Apple... and then this life-changing opportunity is denied by the "totalitarian" Apple.
"We are pissed," Edison Junior CEO Jamie Siminoff told me on the phone today. "I think they are being a bunch of assholes, and I think they’re hurting their customers."
He was direct, didn't mince words, and yet he distinguishes between his opinion and absolute truth ("I think" x2).
Oh c'mon! When you build a product that is completely dependent on buy-in from an independent third party, you need to account for the risk and shouldn't be surprised if they exercise their right not to accommodate you. At the very least, it's inexcusable to call them a-holes because you got emotional about something that was so painfully probable. Apple owes him nothing, and judging by his reaction, they probably made the right choice. I certainly wouldn't want to associate my brand with that behaviour.
/me realizes he's using an iPod shuffle. Oh well.