Happy to see our screen here on Hacker News! Figured I'd post the behind-the-scenes video here too, since it could be of interest to someone on this site.
We anticipated some static from all the ribbons, so the superstructure is heavily grounded and made of aluminum, and we also have aluminum plates inside each pixel module. But when our step drivers started catching fire, we bought a 3M 718 static sensor; it maxed out at 20kV, so I actually don't know how much static we had on the ribbons. We ran tonnes more grounding wires through the modules and that took care of the problem. But yeah, static caused a few fires here before we figured it out.
Would you have any interest in demoing this at the "other" breakfast in NYC - NYC TechBreakfast? Check it out at http://meetup.com/nyc-techbreakfast and drop me a line if so.
This is very cool; the integration of hardware, software, social, and marketing is perfect. Many hours were clearly spent putting all the pieces together, testing, and tweaking, but it makes me wonder: are we wasting talent?
I love what BREAKFAST does as much as the next person, and they are an extremely talented team, but all of this amazing technology and innovation is going towards selling products for a brand that so totally disregards factory workers' wage rights that F21 was "sued by the United States Department of Labor for ignoring a subpoena requesting information on how much the company’s suppliers pays the workers who make its clothes"[0].
This reminds me of the recent article posted on HN "Web Design: The First 100 Years"[1]. How many of these underpaid factory workers are the greatest minds of our time? "We live in a world now where not millions but billions of people work in rice fields, textile factories, where children grow up in appalling poverty. Of those billions, how many are the greatest minds of our time? How many deserve better than they get? What if instead of dreaming about changing the world with tomorrow's technology, we used today's technology and let the world change us? Why do we need to obsess on artificial intelligence, when we're wasting so much natural intelligence?"[1]
While I understand the point you are making, I would argue that the textile industry is a necessary evil for many countries. It's pretty well explained in this article:
>According to the comprehensive “Ashgate Companion to the History of Textile Workers,” a study of 21 countries over 350 years, nearly every nation suffered through its T-shirt phase differently. Argentina’s brutal encomienda system literally worked indigenous laborers to death. The Hapsburg monarchy’s T-shirt phase coincided with its own collapse. Japan’s progress was slowed by a world war; Germany’s was all but destroyed by two. New England’s textile workers had it relatively good; if conditions didn’t improve, they could threaten to leave for the frontier.
>All these countries, however, experienced the same broad phenomenon. Lex Heerma van Voss, an editor of the “Ashgate Companion,” told me that the T-shirt phase lasts only as long as there are large populations of farmers with few options. This is known as a “race to the bottom.” Factory owners compete by offering low prices, which are accomplished by paying workers tiny wages. Cutting costs by a few pennies per shirt may sound trivial, but mass-market brands find that even a slight increase in price destroys demand. And those pennies at wholesale become dollars at retail.
>But once the factories have absorbed all these desperate farmers, they need to find a new competitive advantage. That usually involves making better products. When the T-shirt phase ends, a “race to the top” usually begins. Factories often shift to finer clothes, like dress shirts, which require skilled workers. This phase often involves the growth of unions and rising wages. It’s typically followed by one in which factory owners, forced to pay more, seek out ever more profitable lines of business. That can mean the move to low-end electronics assembly, then auto plants and maybe even airplane manufacturing. At the high end of the spectrum, you begin to see what the U.S. manufacturing economy is going through now — expensive products, like medical devices, which are often made by machines that are operated by highly skilled workers.
As long as there is no slave labor, that is, as long as people are working in these factories "voluntarily", it may be the best option they have. These are people maximizing their utility and deciding to work for USD 50-200/month with six-day weeks and 12-hour days. Yes, it sucks, and I wish they had other options, but I'm not sure that boycotting F21 would work in their favor. As the article sadly states, this seems like a necessary phase.
It's voluntary because they can choose the second-best option: farming, fishing... and that second-best option would have nothing to do with F21 or any other company that uses sweatshops. If you look at it from that vantage point, F21 is giving them a new option that they didn't have before. However, that's not the point I'm trying to make, since I believe that sweatshops have negative externalities that I haven't accounted for.
Choosing anything other than the best option is irrational. Why would I choose second best over best? It's not really a choice, just a word game to rationalize disturbing facts.
I don't understand your point. You are arguing that working for F21 is the best option, yet you oppose companies opening up sweatshops in developing countries?
(Just for the benefit of anyone not familiar with the book, and meant only as appreciation, not criticism.)
"Then the gas-lights guttered in their copper rings, and the orchestra swung into a flat rendition of 'Come to the Bower.' With a huff, the limelight flared, the curtain drew back before the kinotrope screen, the music covering the clicking of kinobits spinning themselves into place."
Is there any technical significance to the "fade out" sequence that happens between the "F21" image and the next Instagram photo? It looks like once it gets to full black, a few pixels are "stuck" (or slipped, I guess), and then something seems to detect that and correct them, so I'm wondering if it's a routine to zero out the pixels or just an effect. The pattern seems to be roughly the same on each run, suggesting that if it is a self-correction algorithm, it isn't persisting the offset/correction (or, if the same pixels keep slipping each time, perhaps the algorithm could account for that by rolling those at a different speed).
Either way, this is really cool! I'd love to work on something like this as a job :)
> Each ribbon also features a reflective strip which is scanned by an infrared sensor, which tells the machine the color each spool is currently showing, allowing for corrections for any slip that may occur.
That's badass. I wondered whether the machine would need constant calibration due to the belts slipping or hopping a gear, etc. I really dig solutions to problems like this. I'm not sure how I'd categorize them; I suppose they're solutions that involve observing reality as directly as possible instead of relying on too many layers of abstraction.
EDIT: I guess now I'm curious... this is probably fast enough that you can literally scroll until you see the right color. I wonder if that's what's happening, or if it still remembers where on the belt it is and tries to take the fastest path there. That'd only be faster half the time, though. I wonder if the overall experience would differ much.
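Just thinking out loud, here's a toy Python comparison of the two strategies; purely speculative, nothing to do with their actual firmware, and the band count is made up:

    # Toy model of a looping ribbon with N discrete color positions.
    # Speculative only: comparing "always scroll forward" against
    # "spin whichever way is shorter".
    N_COLORS = 36  # hypothetical number of color bands on one ribbon

    def forward_steps(current, target):
        """Positions to move if the ribbon only ever advances one way."""
        return (target - current) % N_COLORS

    def shortest_steps(current, target):
        """Positions to move if the ribbon can spin either way."""
        fwd = (target - current) % N_COLORS
        return min(fwd, N_COLORS - fwd)

    if __name__ == "__main__":
        moves = [(c, t) for c in range(N_COLORS) for t in range(N_COLORS)]
        avg_fwd = sum(forward_steps(c, t) for c, t in moves) / len(moves)
        avg_short = sum(shortest_steps(c, t) for c, t in moves) / len(moves)
        print(f"avg forward-only move:  {avg_fwd:.1f} positions")
        print(f"avg shortest-path move: {avg_short:.1f} positions")

On average the shortest-path move is roughly half the length, but as noted above it's only actually shorter on about half of individual moves, so the worst case (and the visible spin) probably matters more than the average.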
This is what I was wondering. My theory was that they were using a camera looking at the front of the display to identify the color being shown by each spool. I think my solution is better :)
Given the project's scope and the agency's budget, I guess doing this with actual threads would have made it very hard to maintain on site, orders of magnitude slower, and would have resulted in a skyrocketing part count. So this compromise is fair and still impressive.
I too found it quite confusing; I kept trying to figure out where the threads were. Was it stitching fabric in real time?
Quite disappointed when I saw it was just colored bands spinning. I might have thought it was cool if I had gone in expecting bands, but going in expecting actual thread and seeing bands was a letdown.
That's irrelevant. The interesting functionality in this machine is the spools, which cycle bands of colored material. Whether that material were colored with crayons or paint or made of threads would not change the behavior of the machine or the effect. The texture of the material might slightly change the appearance, but at the scale in the video I doubt it makes a difference, and different textures (e.g., textile) can easily be simulated.
Seems some pixels are not displaying correct colours on the live feed (most visible as white pixels when displaying the black background of the F21 logo).
I guess the spool belts have drifted out of alignment? I would have guessed they'd have implemented closed-loop positioning for the colour belt, but it appears that's not the case.
It would have been nice to have some kind of closed-loop feedback system for all positions around the ribbons, but it became a cost problem. What we did instead was to put an IR sensor on the motor controller behind each motor. The fabric ribbons are fitted with a retroreflective strip for homing. This gives us one absolute starting point on each pixel, and then each color is just at a fixed ustep offset from zero. But as you pointed out, some of the ribbons are slipping a little bit, so they are a color or two out of cal. The ribbons are real fabric ribbons, not timing belts, so we do get some slip. We periodically run the strip over the IR sensor to zero out the calibration, so bad pixels should re-align at least a little bit over time. If they are really bad, we take them down and swap in a spare 32-pixel module while tightening the belt on the bad ones.
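If it helps to see it spelled out, the control logic is conceptually something like this. This is a simplified Python sketch, not our firmware; the step counts, the color table, and the stepper/sensor objects are all stand-ins:

    # Simplified sketch of the scheme described above: home on the
    # retroreflective strip, treat every color as a fixed microstep offset
    # from zero, and re-zero whenever the strip passes the IR sensor.
    USTEPS_PER_COLOR = 512                               # made-up number
    COLORS = ["black", "white", "red", "green", "blue"]  # made-up palette
    LOOP_USTEPS = USTEPS_PER_COLOR * len(COLORS)

    class RibbonPixel:
        def __init__(self, stepper, ir_sensor):
            self.stepper = stepper    # hypothetical motor driver interface
            self.ir = ir_sensor       # hypothetical reflectance sensor
            self.position = 0         # microsteps from the homing strip

        def home(self):
            """Spin until the IR sensor sees the reflective strip; that's zero."""
            while not self.ir.sees_reflector():
                self.stepper.step(1)
            self.position = 0

        def show(self, color):
            """Each color lives at a fixed microstep offset from home."""
            target = COLORS.index(color) * USTEPS_PER_COLOR
            self.stepper.step((target - self.position) % LOOP_USTEPS)
            self.position = target    # only true if nothing slipped

        def rezero_if_home_visible(self):
            """Opportunistic recalibration: whenever the strip happens to pass
            the sensor, snap the position back to zero so slip washes out."""
            if self.ir.sees_reflector():
                self.position = 0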
How bad would it be to always run the ribbon past the homing strip each time a move is called for? In the worst case you'd have to make just over a full loop instead of a small move, but the stepper motors could probably make the full loop pretty quickly at full blast.
Awesome project, congratulations!
Very cool. Have you thought about using an RGB color sensor (like a TCS34725) to detect the color of the ribbon instead of finding an IR homing strip? That might help with slip, since you know the order of all the colors and could move exactly until the right one is being displayed.
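Roughly what I have in mind, as a sketch only; read_rgb() and advance_one_band() are stand-ins for whatever the real sensor and motor driver expose, and the reference colors would obviously need calibrating per ribbon:

    # Sketch of the idea: instead of homing on a reflective strip, advance
    # the ribbon band by band until the color sensor reading matches the
    # band we want. All hardware calls here are hypothetical stand-ins.
    KNOWN_COLORS = {            # nominal sensor readings per band
        "black": (10, 10, 10),
        "white": (240, 240, 240),
        "red":   (200, 40, 40),
        "blue":  (40, 40, 200),
    }

    def closest_color(rgb):
        """Classify a raw RGB reading as the nearest known band color."""
        def dist(a, b):
            return sum((x - y) ** 2 for x, y in zip(a, b))
        return min(KNOWN_COLORS, key=lambda name: dist(rgb, KNOWN_COLORS[name]))

    def seek_color(target, read_rgb, advance_one_band, max_bands=64):
        """Step band-by-band until the sensor agrees we're showing `target`."""
        for _ in range(max_bands):
            if closest_color(read_rgb()) == target:
                return True
            advance_one_band()
        return False  # never saw the target color: flag this pixel as bad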
Yes! We actually prototyped that. Unfortunately we didn't have great success: because of space constraints in our particular application, we have to have our sensors look down a 1/8" passage (past the motor) onto the ribbons. I feel like I tested every single optical sensor from Digi-Key before we finally had to settle on an IR sensor. We use a pretty strong IR LED and bounce that off a reflector on the ribbon. We simply could not separate the signal from the noise trying to read the actual color of the ribbon. Moving the sensor off the PCB (closer to the ribbon) would have worked, but the cost increase for 6400 pixels... yeah, couldn't do it.
You could have printed a quadrature pattern on the back for not much more complexity. Just one additional sensor per strip. Even that wouldn't be an absolute necessity since you know which direction the rotation is going. A single series of bars with one skipped or widened for home would suffice. Then you'd be able to compensate more readily for slippage and stretching.
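Something like the following is all the decoding you'd need. Just a sketch, with read_bar_sensor() as a hypothetical digital input and the bar timing numbers made up:

    # Sketch of the single-track idea: one sensor watching a series of
    # printed bars, with the home bar made ~3x wider than the others.
    # Rotation direction is known, so counting bars gives position and the
    # wide bar resets the count once per revolution, absorbing slip/stretch.
    import time

    NORMAL_BAR_SECONDS = 0.005   # made-up nominal time a bar covers the sensor
    HOME_FACTOR = 3              # home bar is ~3x wider than a normal bar

    def track_position(read_bar_sensor, on_position):
        position = 0
        while True:
            while not read_bar_sensor():     # wait for a bar to arrive
                time.sleep(0.0001)
            start = time.monotonic()
            while read_bar_sensor():         # wait for the bar to pass
                time.sleep(0.0001)
            width = time.monotonic() - start
            if width > HOME_FACTOR * NORMAL_BAR_SECONDS:
                position = 0                 # wide bar: that was home
            else:
                position += 1                # normal bar: one band further
            on_position(position)

(Timing-based width detection assumes a roughly constant ribbon speed; with steppers that's reasonable, otherwise the skipped-bar variant would be safer.)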
Any reason why you didn't use a camera looking at the front of the display? That way you'd only need one sensor and the ribbons wouldn't need to be instrumented in any way.
> The fabric ribbons are fitted with a retroreflective strip for homing. This gives us one absolute starting point on each pixel and then each color is just at a fixed ustep offset from zero.
If you had a reflective strip between each color—and just counted the strips to determine the current color—wouldn't that have kept it in continuous calibration? Then you wouldn't even need a stepper motor; you could use a plain DC motor, since you wouldn't be using the step count to determine position anymore (roughly as in the sketch below).
(I haven't used that setup for a real project, but I intend to since it seems it would be cheaper and easier, and I was wondering if you considered something like that but dismissed it.)
Also, if you don't mind sharing: what IR sensor did you use?
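For what it's worth, the control loop I had in mind is about this simple. Sketch only; dc_motor and strip_sensor are hypothetical interfaces, and you'd still need one distinguishable strip (or a power-on homing pass) to know the starting index:

    # Sketch of the "count reflective strips" idea: a plain DC motor plus one
    # IR sensor, with a strip between every color band. Because every band
    # boundary is sensed, the position can't silently drift between moves.
    N_COLORS = 12  # made-up number of bands (and strips) per ribbon

    def goto_color(target_index, current_index, dc_motor, strip_sensor):
        """Run the motor until enough strip edges have passed the sensor."""
        strips_to_pass = (target_index - current_index) % N_COLORS
        seen = 0
        was_on_strip = strip_sensor.is_reflecting()
        dc_motor.on()
        while seen < strips_to_pass:
            on_strip = strip_sensor.is_reflecting()
            if on_strip and not was_on_strip:   # rising edge = band boundary
                seen += 1
            was_on_strip = on_strip
        dc_motor.off()
        return target_index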
Interesting... One further question: when does it recalibrate? Opportunistically, whenever it's already running past the reflective strip (I'm guessing in line with displaying black), or every so often (after each image cycle)?
One workaround that could better hide the inaccuracy would have been to use only smooth colour transitions on the ribbons (including between black and white), as it appears to be the hard transition between black and white that makes the small positional error so noticeable.
And ideally not just using an encoder to detect skipped steps or something, but using some kind of vision to close the loop using color, since you're making an image.
At the very least put a Gray code on the back of every fabric loop to get actually accurate positioning.
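E.g., a 4-bit reflected Gray code track read by four optical sensors gives 16 absolute positions per loop, and decoding it is trivial (sketch; the sensor objects are hypothetical):

    # Sketch: decode a 4-bit reflected Gray code printed on the back of the
    # loop. The decoded value is an absolute position, so slip is corrected
    # on every read. Because adjacent codes differ by one bit, a read taken
    # mid-transition is off by at most one position.
    def gray_to_binary(gray):
        binary, shift = gray, 1
        while (gray >> shift) > 0:
            binary ^= gray >> shift
            shift += 1
        return binary

    def read_position(sensors):
        """sensors: list of 4 hypothetical optical sensors, MSB first."""
        gray = 0
        for s in sensors:
            gray = (gray << 1) | (1 if s.is_dark() else 0)
        return gray_to_binary(gray)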
Coding on the underside crossed my mind too, but then you'd need a multi-sensor tx/rx board for each ribbon, mounted inside and under the moving parts. And it looks like this mech doesn't have a lot of space to begin with.
That's why using a high res camera to image the entirety of the spools and make adjustments as necessary seems like the best way to close the loop. It won't be able to make minor adjustments but if something is really wrong it wouldn't be hard (algorithmically) to locate which spool is totally bonkers and get it much closer to right.
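A crude version of that check isn't much code, assuming you can already warp the camera frame down to the 80x80 grid of per-spool average colors (which is the hard part):

    # Crude sketch of the camera idea: compare what each spool should be
    # showing against what the camera actually sees and flag pixels that
    # are way off so they can be re-homed. The threshold is made up.
    import numpy as np

    ERROR_THRESHOLD = 80.0  # hypothetical RGB distance that counts as "bonkers"

    def find_bad_spools(expected, observed):
        """expected, observed: (80, 80, 3) arrays of per-spool RGB values.
        Returns (row, col) grid coordinates of spools that look badly wrong."""
        error = np.linalg.norm(expected.astype(float) - observed.astype(float), axis=2)
        rows, cols = np.where(error > ERROR_THRESHOLD)
        return list(zip(rows.tolist(), cols.tolist()))

Then for each flagged spool you'd just trigger the normal re-homing routine rather than trying to nudge it by a few steps.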
It weighs thousands of pounds (of aluminum!), with a 600-pound custom frame and an even more massive support structure; it uses 6,400 thread spools comprising over 200,000 custom parts and requires 24/7 temperature and humidity control, all to display an 80x80-pixel, 0.02k-color image. This took a year and a half to build, and it will only run for one week.
Kudos to the marketing department for giving BREAKFAST (WHY IS IT IN ALL CAPS?) employees a super fun job for a year and a half, but holy cow, this is a huge waste of time and money. They could have gotten more out of Facebook ads.
Interesting; it reminds me of the graffiti robot for some reason.[0] A neat follow-on project would be to rapidly "print" via an actual loom and weave an entire fabric panel as the display. The thread aspect is neat; however, it comes off as an e-ink shortcut. It's quite an engineering feat that the majority of it works as reliably as it appears to (only a few dead "pixels" out of thousands).
Very impressive. I'm sure people here would love to hear more about the challenges you faced and what kind of things were surprisingly easy to get working.
It looks like you followed the basic design of a Van de Graaff generator, so I imagine you had a very difficult time with static electricity.
How did you handle alignment of the colors? Maybe a Gray code and an optical sensor on the back side of the bands?
https://www.youtube.com/watch?v=dvDHNDkO-Qo
Cheers