And the laptops only come with USB-C ports, and I've yet to meet anybody who owns anything that can be plugged into one of those without a dongle (excluding, I suppose, their Mac charger :)).
I have one of the 2016 MBPs and while I bought a USB-A to USB-C dongle, I almost never use it. I have the USB-C to USB-C cable that came with it for power, and also (gasp) bought a couple new cables: one USB-C to Lightning, and one USB-C to Micro-USB.
I get that people are upset that Apple's gone full bore on USB-C "early," but they did that with USB, period. When the first iMac came out in 1998, it was the first computer to have only USB ports--and pretty much nobody was making USB peripherals back then. (And one of the two USB ports on the iMac was guaranteed to be taken up by your keyboard/mouse, so as USB took off, Apple was also early to the "not having enough ports" game. Heyo!)
There are legitimate things to complain about with the current MBP models; I don't like the keyboard, either, although I certainly don't have any problem with battery life. (My understanding is that it varies much more sharply than previous models depending on the system load.) I think only 2 USB-C ports on the low-end 13" model (the one I have) is too few. I think all of the laptops should probably have SD slots (c'mon, Jony, SD cards are thin, okay?). And at least so far, the Touch Bar hasn't justified itself. But when it comes to going all-in on USB-C, I'm not one of the ones who thinks that's a mistake on Apple's part. If anything, I think it's a mistake not to swap the Lightning port on the iPad Pro for a USB-C port.
Kinda hard to describe, but Saver Screensson creates stylish, unique patterns on your display by stacking vector stencils. Screensson contains 340 individual images and 19 predefined color palettes, generating countless multilayered compositions.
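For the curious, the general recipe is easy to sketch, though this is just an illustration of the idea and not Screensson's actual code (the palettes, the "4 layers" choice, and the function names below are all made up): pick a palette, tint a few randomly chosen stencil images, and stack the tinted layers into one frame. In Swift/AppKit it might look roughly like this:

    import AppKit

    // Illustrative sketch only: pick a palette, tint a few stencils, stack them.
    let palettes: [[NSColor]] = [                      // stand-ins for the built-in palettes
        [.systemTeal, .systemOrange, .white],
        [.systemPink, .systemIndigo, .black],
    ]

    // Fill a canvas with `color`, then keep the fill only where the stencil has alpha.
    func tinted(_ stencil: NSImage, with color: NSColor, size: NSSize) -> NSImage {
        let image = NSImage(size: size)
        let rect = NSRect(origin: .zero, size: size)
        image.lockFocus()
        color.setFill()
        rect.fill()
        stencil.draw(in: rect, from: .zero, operation: .destinationIn, fraction: 1.0)
        image.unlockFocus()
        return image
    }

    // Stack a handful of tinted stencils over a background color.
    func composeFrame(stencils: [NSImage], size: NSSize) -> NSImage {
        let palette = palettes.randomElement()!
        let rect = NSRect(origin: .zero, size: size)
        let frame = NSImage(size: size)
        frame.lockFocus()
        palette.last!.setFill()
        rect.fill()
        for stencil in stencils.shuffled().prefix(4) {
            let layer = tinted(stencil, with: palette.randomElement()!, size: size)
            layer.draw(in: rect, from: .zero, operation: .sourceOver, fraction: 0.85)
        }
        frame.unlockFocus()
        return frame
    }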
It isn't much better for simple animations like this, but I find it much easier for complex, gesture-based, dynamic animations that can change at any time based on user input.
Given that much of the Paper prototyping was done in Origami, this was my first thought when I saw this library. I haven't actually used Paper yet (it's still US-only), but I'm in love with these kinds of tactile, fun interfaces.
I think the main difference is that they're easy to interrupt and continue from the current real position, whereas doing that with CA is more difficult. That would let you do some nifty stuff with gestures, I guess.
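To make that concrete, here's a rough sketch of the "interrupt and continue from the current position" idea, using UIKit's UIViewPropertyAnimator purely as an illustration (not the library being discussed; the card view, the 300pt travel distance, and the damping values are all made up). The pan gesture scrubs a paused animation and then lets it finish from wherever the view actually is:

    import UIKit

    final class CardViewController: UIViewController {
        private let card = UIView(frame: CGRect(x: 40, y: 400, width: 300, height: 200))
        private var animator: UIViewPropertyAnimator?

        override func viewDidLoad() {
            super.viewDidLoad()
            card.backgroundColor = .systemBlue
            view.addSubview(card)
            card.addGestureRecognizer(
                UIPanGestureRecognizer(target: self, action: #selector(handlePan)))
        }

        @objc private func handlePan(_ gesture: UIPanGestureRecognizer) {
            switch gesture.state {
            case .began:
                // Define the animation toward the "open" position, then pause it
                // immediately so the gesture can drive it.
                animator = UIViewPropertyAnimator(duration: 0.5, dampingRatio: 0.8) {
                    self.card.frame.origin.y = 100
                }
                animator?.pauseAnimation()
            case .changed:
                // Scrub directly from the finger; the view tracks the gesture
                // instead of being locked to a fixed timeline.
                let progress = -gesture.translation(in: view).y / 300
                animator?.fractionComplete = max(0, min(1, progress))
            case .ended, .cancelled:
                // Finish from wherever the view actually is right now.
                animator?.continueAnimation(withTimingParameters: nil, durationFactor: 0)
            default:
                break
            }
        }
    }

With plain CA you'd typically have to read the presentation layer, remove the in-flight animation, and add a new one starting from that value, which is exactly the fiddly part being described.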
Take a look at the app that the OP is building. It's pretty interesting and uses this scripting bridge for generating OpenGL visuals. (http://github.com/aptiva/tranquil)
Might be cool to write a DAAP server that uses this so you get your Google Music inside of iTunes as a shared library. If I weren't so loaded with stuff right now I might start a project. Oh heck, maybe I'll do it anyway.
Unfortunately the DAAP library I started with doesn't support some of the options that iTunes looks for. It works with Rhythmbox and Banshee on Linux, and someone told me that it worked with an Android DAAP client.
Unlike the API posted here, which emulates requests from the web interface, mine is based on the internal API used by Google's Android client.
It's far enough along that it can list the whole library, and a client like Rhythmbox can request and successfully play a song.
I stopped mostly because of time constraints, but I'm hoping to find time to continue working on it. Things like seeking within tracks and playlist support would be nice to have.
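For anyone who wants to pick this up: the discovery side is the easy part. DAAP shares are advertised over Bonjour as _daap._tcp, conventionally on port 3689, which is how Rhythmbox and iTunes find them; the library listing and streaming then happens over HTTP on that same port (/server-info, /login, /databases, and so on). A minimal sketch of just the advertising bit in Swift (the class and share name are placeholders, and this says nothing about the extra options iTunes looks for):

    import Foundation

    // Sketch only: advertise a DAAP share over Bonjour so clients can discover it.
    // The HTTP side of DAAP would have to listen on the same port.
    final class ShareAdvertiser: NSObject, NetServiceDelegate {
        private let service = NetService(domain: "local.",
                                         type: "_daap._tcp.",
                                         name: "Google Music Bridge",  // placeholder share name
                                         port: 3689)

        func start() {
            service.delegate = self
            service.publish()
        }

        func netServiceDidPublish(_ sender: NetService) {
            print("Advertising \(sender.name) on port \(sender.port)")
        }

        func netService(_ sender: NetService, didNotPublish errorDict: [String: NSNumber]) {
            print("Failed to publish: \(errorDict)")
        }
    }

    let advertiser = ShareAdvertiser()
    advertiser.start()
    RunLoop.main.run()  // keep the process alive so Bonjour callbacks fire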