Update on Novena Open Laptop (bunniestudios.com)
361 points by irq on July 6, 2013 | 62 comments



I desperately want to learn more about board layout and system design. I think the costs of making custom hardware have fallen so much that a hardware renaissance is almost here, if not already. I've done embedded/kernel work before, so I theoretically have that side covered. :)

Does anyone know of a list of projects I should do to at least be able to pick up KiCad/Eagle and start laying out boards, turning them into Gerber files and then into manufactured boards?

One suggestion I got was to design and implement one's own Atari Punk Console: http://www.makershed.com/product_p/mkjr2.htm

I also found this book: http://www.amazon.com/Build-Your-Printed-Circuit-Board/dp/00...


I'd say the best exercise to start with is to buy some bare PCBs from an open source project or two, and then try to source all the components yourself. I think I learned a lot about component types/specs/sizing/packaging doing this, even though I had zero EE experience.

I tried modifying some open source designs, as well as designing a board from scratch, using Eagle and Kicad last year. I found the software so cumbersome and maddening, I gave up rather quickly.

I've stumbled upon some better tools since then. I hope one of these will get me from idea to gerber with less heartburn.

* http://www.layouteditor.net/

* http://razencad.com/

Those are both free. On the commercial side, there are a lot of options, but I'm tempted to just buy Diptrace, which seems to have a good array of features and reasonably-priced licenses.


Perhaps someone could comment on simulation software, particularly on the simple/beginner side of things?

For instance, recently I got some supercapacitors and had a few ideas for a simple circuit (consisting of resistors, diodes and capacitors). I did an EE unit when I was at school a long time ago but wanted to get a better understanding of how my circuit would behave, particularly voltage levels and discharge rates over time.

I was expecting there would be some software which would allow me to draw a simple circuit, specify input voltages and press 'play', allowing me to see how the circuit acted as a live simulation and see charts of voltage at particular nodes over time.
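To make it concrete, here's the kind of calculation I want the tool to do for me, roughly sketched in Python (the supercap and resistor values are just placeholders):

  import math

  # Hypothetical values for a supercap charging through a series resistor.
  V_SUPPLY = 5.0   # supply voltage (V)
  R = 100.0        # series resistance (ohms)
  C = 1.0          # supercapacitor (farads)
  TAU = R * C      # time constant (seconds)

  # Voltage across the cap while charging: V(t) = Vs * (1 - e^(-t/RC))
  for t in range(0, 601, 60):
      v = V_SUPPLY * (1.0 - math.exp(-t / TAU))
      print(f"t = {t:4d} s   Vcap = {v:.3f} V")

But I'd rather point and click than derive the equation for every circuit, especially once diodes come into it.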

However, despite googling around, I couldn't find such software geared towards visualisation of simple analog circuits...

Happy to look at OSS or proprietary software; I was expecting to find something used in academia, but no such luck.

Any recommendations?



Wow, thanks - I played with it for hours and promptly signed up for the paid version. Great tool.


MITx's 6.002x has an online circuit sim. There's also QUCS for the desktop.


CircuitLab is keeping me busy at the moment, but thanks, I'll check this out.


Eagle and Kicad have a pretty steep learning curve. Diptrace is far-and-away the easiest to use of the "hobby level" EDA packages. They have a non-profit license (for anyone not looking to sell the fruits of their labor) which works fine for most hobbyist boards.


I'd recommend you start by designing shields and/or capes.

Designing a complete system from scratch has three components: the system design, verification of the design, and then bring-up. It's very frustrating there in the middle. If you want to practice, take a built system and the data sheets for the parts, and write an operating system for it. A lot of chicken/egg problems. Things like "OK, see if the HW bootloader is working by loading a loop which toggles this line, which I can watch using an oscilloscope." Real starting-from-nothing-into-something stuff.

Designing shields (Arduino) and capes (BeagleBone) gives you a "known" working system so that you can prototype various peripherals you want in the final system. This is what manufacturer evaluation boards are actually for. They have the ability to be configured into all of the supported modes so that you can try them out prior to building them. A number of systems just lift the schematic from the Evaluation board, delete the parts they don't want, add the parts they do, and voila system "design."

Staying below 20 MHz is a good idea until you get a handle on what happens to signals as they travel across a circuit board. Working with LVDS (low voltage differential signalling) lines only makes sense when you see what a 50 MHz line looks like 4" from the processor on an unterminated 3 mil trace. :-)
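As a rough back-of-the-envelope sketch of why that matters (the rise time and per-inch delay below are just typical FR4 assumptions, not measurements): a common rule of thumb is to treat a trace as a transmission line once its one-way delay exceeds about 1/6 of the signal's rise time.

  # Rule-of-thumb check: does this trace need transmission-line treatment?
  # Assumed numbers; adjust for your own stackup and drivers.
  PROP_DELAY_PS_PER_INCH = 150.0  # roughly what an outer-layer FR4 trace gives you
  RISE_TIME_PS = 2000.0           # a 2 ns edge, plausible for a fast CMOS output

  critical_length_in = RISE_TIME_PS / (6.0 * PROP_DELAY_PS_PER_INCH)
  print(f"critical length ~ {critical_length_in:.1f} inches")
  # ~2.2 inches here, so that 4 inch unterminated run is well past it.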

Also, if you make useful shields/capes and re-sell them, you can earn some money to buy the tools to make the more complex systems. You will also discover that a 500 MHz oscilloscope that is 15 years old is nearly as expensive as a 500 MHz oscilloscope that is new, because if it works and is calibrated, then it works. Kind of like machine tools: they have intrinsic value.


Working with LVDS lines only makes sense...

Heh... forget making sense, you can't even tackle some LVDS systems until you start thinking about your "observer effect" when you try to measure signals, i.e. the impedance of your probes...


I've gotten into PCB design the last month or two. It is surprisingly accessible and super fun! I'm using Eagle and have had boards made by OSH Park.

My first goal is an FPGA dev board, not altogether unlike the FPGA module on bunnie's laptop. The board I envision has a USB interface (for FPGA configuration and communication with the host computer), a few peripherals, and a high-speed I/O header. I broke the project into three steps: a microcontroller board with a USB interface not unlike the Arduino Leonardo (built and working), a board with a small (= cheap) FPGA to experiment with soldering BGAs (board just back from the fab), and the final design.

The ultimate chip I want to use is a BGA, which needs to be soldered with a process called reflow. (Basically, you bake the board in an oven.) So as a side project, I'm modifying a toaster oven into a reflow oven, with the first microcontroller board as the controller. Lots of people have done this; you can find tutorials online. I started with an AVR chip for the USB, but I'm planning to switch to an ARM Cortex-M series chip (STM32F4, only a few dollars more but way more capable) in later boards.
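The controller logic itself is simple; the fiddly part is following the thermal profile. Here's a rough sketch of the approach (the temperatures and times are typical lead-free ballpark figures from tutorials; the real numbers come from your solder paste's datasheet):

  # Piecewise-linear reflow target profile: (seconds, degrees C) waypoints.
  # Illustrative lead-free-ish numbers only; check your paste's datasheet.
  PROFILE = [(0, 25), (90, 150), (180, 180), (210, 235), (235, 235), (300, 60)]

  def target_temp(t):
      """Linearly interpolate the target temperature at time t."""
      for (t0, c0), (t1, c1) in zip(PROFILE, PROFILE[1:]):
          if t0 <= t <= t1:
              return c0 + (c1 - c0) * (t - t0) / (t1 - t0)
      return PROFILE[-1][1]

  def heater_on(t, measured_temp, hysteresis=2.0):
      """Bang-bang control; the oven's thermal mass does the smoothing."""
      return measured_temp < target_temp(t) - hysteresis

  print(target_temp(200), heater_on(200, 210.0))  # e.g. thermocouple reads 210 C at t=200 s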

Have fun!


I haven't messed around with those layout tools, but I can recommend this book if you are interested in the actual engineering: http://www.amazon.com/High-Speed-Digital-System-Design-Inter...

(EDIT: fixed wrong link)


That book looks great, thanks! I've been reading:

High Speed Digital Design: A Handbook of Black Magic by Johnson and Graham:

http://www.amazon.com/High-Speed-Digital-Design-Handbook/dp/...

and RTL Hardware Design using VHDL by Chu:

http://www.amazon.com/RTL-Hardware-Design-Using-VHDL/dp/0471...

(the latter for designing for FPGAs)


Sparkfun has a good eagle tutorial - https://www.sparkfun.com/tutorials/108


Yes, and it might be instructive to mod one of Sparkfun's boards - for which they often post the sources.


It's kind of funny: after seeing all these articles on HN, I finally see one that relates to me :). I remember growing up with Bunnie's Xbox hacks, and it got me really interested in embedded/hardware stuff.

Fast forward to today: I'm working on my dream of creating a laptop that doesn't stink: good quality, very Linux friendly, and easy to use (by making an easy-to-use Linux distro). I also want to bring privacy and security by default by leveraging open source tools (including encrypted email and chat). There is a lot of room for growth, and the niche markets of developers, hackers and future hackers could one day extend into other markets.

Alas, my skills in PCB design extend only to motor controllers/drivers. I want to learn how to do high-frequency bus design, say to make ARM or i7 motherboard designs. If anyone here knows what kind of software is used for that sort of thing, I'd love to know.


You mean schematic capture and PCB layout software?

- KiCAD: http://www.kicad-pcb.org/

- Eagle: http://www.cadsoftusa.com

Those are the most popular entry-level ones. You can make an ARM or i7 board with either. I think Bunnie is using Altium, which is much more pricey.


See also:

- DipTrace: http://www.diptrace.com/ (probably best bang/buck in terms of capabilities/learning time)

- gEDA: http://www.geda-project.org/ (not a beginner tool)

- Fritzing: http://fritzing.org/download/ (super beginner tool)


And the YC-backed start-up, Upverter: http://upverter.com/


I'm in a similar situation - I don't claim to be an expert, but I have a plan to gain expertise.

Software-wise, Altium is one popular choice (I've used this and I like it; bunnie uses it [1] as does Dave of eevblog) and another is Cadence Orcad (I haven't used this myself, but the SABRE schematics come in this format). These are both expensive; Cadence has a range of options, from a standard version 1 year license costing $1,300 to an advanced version perpetual license costing $9,995 [2] - for Altium 1 year is $1,750 and a perpetual license is $5,495 [3].

Piracy protection used to be weak, and I'm not up-to-date on how easy it is to bypass. Like Photoshop, people used to say the companies turned a blind eye to non-professional users who were learning. This may have changed recently, I don't know.

Anyway, on to my plan to learn this stuff. Bunnie's planned laptop uses a Freescale i.MX 6 processor. It seems pretty feature-packed as a processor. Among other things, Freescale seem to offer some good design documentation; they have evaluation boards [4] where they give out not just pdf schematics and gerbers but also the CAD software files.

They also have extensive design documentation. Public datasheets, Hardware Development Guide, Processor Reference Manual and so on [5]. I've also discovered getting prototypes assembled in china is cheaper than I expected [6] - a few hundred rather than a few thousand dollars.

My provisional plan is to read all that freescale documentation, take one of the freescale board reference designs, then to basically produce a cut down version mostly by deleting components I'm not fussed about, then to get that made and see if it works.

That's my plan; whether it will work, or be possible for the amount of money I have, remains to be seen :) and needless to say, if an expert comes along to reply to you, maybe they'd have feedback for me too :)

[1] http://andrew.huang.usesthis.com/ [2] http://www.ema-eda.com/Products/CadenceOrCAD/OrCADPCBDesigne... [3] http://www.altium.com/files/pdfs/altium%20designer%202013%20... [4] http://www.freescale.com/SABRESDB [5] http://www.freescale.com/webapp/sps/site/prod_summary.jsp?co... [6] http://www.7pcbassembly.com/PCB-assembly-price.php


I'm working on a project right now that started pretty much the way you described. We took Freescale's SABRE SD schematics and DNIed the stuff that wasn't necessary.

I will say that I like working with Freescale's reference material better than, say, TI's OMAP and AM series. Of course I work for a larger company that has easy access to their FAEs, but at least they have an online forum where you can get answers...sometimes.

You've sniffed out one big problem with the Cadence Orcad layout. Freescale recommends you just cut-and-paste the DDR3 lines directly from their layout. Partially because the timing is so hard to get right and partially because they don't have the bandwidth to help you debug your board layout. So that's probably the biggest hurdle in layout. TI went a different way with package-on-package RAM mounting, but that's another set of headaches when it's time to build boards.

Another nice part is that you can get working boards today if you want them. I'm a big fan of Boundary Devices (http://www.boundarydevices.com), you can get a SABRE-based board for $99. Pick up the kernel patches from FSL and you can have your own system building and running in an afternoon.

The i.MX6 is also a nice evolution of their SoC family, especially now that you can opt out of their power-management IC. But that means a bit more schematic and layout work if you choose to go that route. I'd leave the PMIC in, you also get some nice additional features if you use it.


Freescale's documentation was fantastic. TI's was mostly good, except for whole areas they left out.

Marvell . . . hoo boy. Let's just say that everyone telling us "If you go with Marvell, you're in for a ride" was correct. Just enough detail to get you into trouble. Cheap, though. . . . :-)


Yeah thankfully I never got that far with Marvell. We had their eval platform while also looking at the iMX (well, we just bought a Nitrogen from Boundary. Much easier).

The Marvell platform was just a trainwreck. Even simple questions were met with blank stares.


We had to talk to the actual chip designers to get questions answered. After weeks of asking stuff that just got echo-chambered back.

Us: "We want to know the delay period between freezing the timer and reading it. The documentation says 'three clocks', but we have strong evidence this is not accurate."

Marvell: "We are happy to help with your question. You need to wait three clock cycles."

Us: "That doesn't work. (insert mountain of evidence here)."

Marvell: "Let us get back to you."

(much later)

Us: "Remember that question about the clock register?"

Marvell: "What question?"

(repeat above, several times)

Much later, we get a phone session together.

Obviously new Marvell chip designer: "It says three clock cycles here."

Us: (lots more evidence that that is not the case)

Slightly terrified (but still new) Marvell chip designer: "Let us get back to you."

(much, much later ...)

Marvell Chip designer, who clearly has his act together, but designed the circuit four years ago and can't remember, so he digs into the design files: "Oh yeah. Three clocks, but they're internal, so you can't see them. Use this clock over /here/ and write this, then read that, and you should be good."

Repeat for each major chip feature, and don't get me started on DRAM controllers or USB PHY registers.


Wow, that's pretty bad. I've never had to go that deep with a chip manufacturer. Then again I've been using Motorola/Freescale for 30 years...


I just had an interview with bunnie about open hardware, hardware hacking, and what projects he is doing recently; he shared a lot of deep thoughts. http://www.bunniestudios.com/blog/?p=3234


Thank you so much for posting - very insightful interview. I know HN doesn't really approve of "me too"-comments -- but this really deserves being submitted as a story on its own.

(I've become more than a little discouraged from my few submissions that have sunk through the front page like a stone -- so I'll just leave this comment here :-)

edit: Be sure to also read the linked "interview that focuses on one of my recent failures.":

http://makezine.com/2012/04/30/makes-exclusive-interview-wit...


If you read bunnie's post carefully, this probably isn't the open laptop that most people are hoping for. It's tailor-made for an EE who does a lot of high speed digital work and wants an open laptop. RTFM indeed!


Since a few people on this thread are talking about wanting to do board design without prior EE experience I decided to top-post this rather than post individual replies.

This has come up on HN a few times. As an EE with plenty of experience designing all manner of hardware, I'll tell you that, while not impossible, there's quite a road to travel from having no formal training in electronics to designing and laying out reliable working boards. Not to say it can't be done. Anyone with enough drive and motivation these days can learn almost anything with the myriad of resources out there. Possible? Yes. Plausible? Only for the really driven.

There's a demarcation line somewhere in the 25 to 50 MHz range beyond which the design of digital electronics isn't very tolerant of amateur design practices. You really start to need a solid understanding of the underlying science. Unlike software development, it is frustrating and expensive to blindly poke around mid to high speed design in hopes of finding a solution.

Digital circuits in many ways become very analog as frequency increases and clock edges get sharper. They can and do radiate RF, and they can and do bounce around the copper traces, mucking things up if either the electrical design or the trace layout and board stack-up are off.

Something as seemingly "simple" (in quotes because it isn't simple) as implementing a DDR2 or DDR3 memory interface requires quite a range of information in your mental design database.

For example, you need to start to consider the time it takes for signals to leave the FPGA (travelling at nearly the speed of light) and arrive at the chip. These signals must arrive within a very precise window of acceptance. A window which is measured in nanoseconds. You need to model the signal path from the die through the interconnect, BGA ball, land and via stack, trace and then back up the chain into the memory chip. And you need to do this for every single connection.
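To give a feel for the scale, here's a rough sketch of the arithmetic you end up doing per net (all the numbers below are illustrative, not from any real design):

  # Rough flight-time arithmetic for one memory data line (illustrative numbers).
  PROP_DELAY_PS_PER_MM = 6.0  # roughly typical for an inner-layer FR4 trace

  trace_mm   = 42.0   # routed length of this data line
  package_ps = 55.0   # die-to-ball delay, from the vendor's package model
  via_ps     = 8.0    # via stack down to the routing layer

  flight_ps = trace_mm * PROP_DELAY_PS_PER_MM + package_ps + via_ps

  # Compare against the strobe path and whatever skew budget the memory allows.
  strobe_flight_ps = 300.0
  print(f"data flight: {flight_ps:.0f} ps, skew vs strobe: {flight_ps - strobe_flight_ps:+.0f} ps")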

While you do the above you also need to look after the integrity of the signals themselves. In fact there's a discipline called "Signal Integrity" (SI) that deals with this. Most large companies have dedicated Signal Integrity engineers. The field is somewhat complex and requires a good deal of experience and study, particularly for mass production. Anyhow, at higher speeds everything is a transmission line. Failing to design with this in mind pretty much guarantees that a circuit that looks absolutely perfect on paper will not work.
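For a flavour of what "everything is a transmission line" means in practice, here's the classic IPC-style approximation for the impedance of a surface microstrip. It's only a sketch; it's valid over a limited range of geometries, and real stackups go through a field solver:

  import math

  def microstrip_z0(w_mil, h_mil, t_mil, er):
      """IPC-2141-style approximation for surface microstrip impedance (ohms).
      Rough only; roughly valid for 0.1 < w/h < 2.0."""
      return (87.0 / math.sqrt(er + 1.41)) * math.log(5.98 * h_mil / (0.8 * w_mil + t_mil))

  # Example: 9 mil trace, 5 mil above the plane, ~1.4 mil (1 oz) copper, FR4 er ~4.3
  print(f"{microstrip_z0(9, 5, 1.4, 4.3):.0f} ohms")  # lands in the 45-50 ohm ballpark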

SI has a close cousin called "Power Integrity". As operating frequencies increase and signal edges become sharper, the power distribution system (PDS) becomes very critical. Once again, not understanding the subject can result in a board that looks great but does not work, or worse, is hopelessly unreliable. PDS design issues can cause resonance and oscillations on the power planes, increase clock jitter, destroy signal integrity and timing, and more. This too is a subject that is usually the domain of specialists in larger organizations.

SI and PDS design alone can mean that you can have a schematic that looks absolutely perfect and a circuit board that does not work, is unreliable or has a "personality" such as weird things happening, random resets and other hair-pulling afflictions. SI and PDS design knowledge is of paramount importance as design operating frequencies rise.

That's not the end of it. Digital design with FPGAs also needs to consider such topics as the number of average simultaneous outputs switching per side/bank on the device. This has both SI and PDS implications.

As you approach the GHz range things get even more interesting. Examples of this might be doing HDMI or DVI transmitter/receiver designs or Thunderbolt/PCIe interconnects. Now you are solidly in RF territory, a land with its own set of voodoo to learn.

As I said above, I draw this imaginary demarcation line somewhere in the 25 to 50 MHz range. Below this line one can hack around and build stuff that works well or reasonably well most of the time. I'll generalize and say that's the Arduino and below territory, the hobby embedded territory. Nothing too critical. Timing windows are comfortable. SI issues are relatively minimal. SSO is mostly not of concern. Transmission line or impedance controlled design is rarely needed, etc.

The minute you start talking about something like a laptop board or high performance FPGA based, well, anything, you enter a domain that is either reserved for very experienced EEs or requires a ton of time and dedication to master as a non-EE. This is also a domain where hobbyist tools tend to fail miserably. I use Altium Designer for electronic design along with FEA tools for SI and PDS designs on high speed boards. I also use a number of custom software tools created over time in order to simplify and facilitate the process. Most companies or designers develop guidelines, rules, tools and procedures over the years in order to improve design yields. Like I said, a perfect schematic does not directly equate to a working board.

I haven't even covered topics such as thermal design and regulatory testing. That's a whole other layer of knowledge that is critical to have depending on what you are doing.

Making hardware is very different from writing software in that the iterations per unit time are both slower and very costly. You can sit in front of your computer and hack for hours and hours, fail, edit, compile, learn and it costs nothing. Not so with hardware. You really can't learn that way. Even if you had money to burn, at one point you really need to understand the science behind it all in order to not burn cubic time learning by trial and error.

Even freshly graduated EEs are not necessarily qualified to successfully approach high-speed design. It takes years of learning and dedication beyond school to really master the art. This is not a matter of software. Even if I gave you a license to $100K worth of EE design software (yes, it can get that expensive) and fully trained you on its use, you would not be prepared to design high speed boards with it.

Again, none of this is to say this is an impossible domain for non-EE's to approach. Just pointing out that it is an iceberg. What you see is what's visible above the surface.

Where to start? College Physics and Calculus. Basic Electronics. Transistors. AC, DC and RF Circuit theory. Buy kits and breadboard-based trainers and build circuits. Make lots of boards. Learn. Look at the set of courses required for an EE degree and learn as much of that as you can (and in the order they are presented). Reading books about signal integrity right now without the required foundation isn't a good idea. It's like trying to read a Chinese book on Chinese literature without having learned the Chinese language first.

Good luck.


A thousand times yes. Set your expectations accordingly and then go for it. I've compared laying out a high speed board to the equivalent of pulling out the Linux scheduler, rewriting it to use an atomic clock for systick and re-inserting it into the kernel such that every distro still works :-) That said, the really cool thing about "System on Chip" or SoC devices (like the one in the Raspberry Pi or on the BeagleBone) is that most of the hairy high speed stuff is inside the chip and you're only worried about the low speed external stuff. That is much more doable.

I started laying out a "small" Zynq board, and running through the DDR3 timing calculations for each signal line was insanely complex. I was like "seriously?" It explained board layouts where the memory traces are all made the exact same length even if that wastes board space.

That said, if you're interested in pursuing this and are willing to spend some time taking online training and building a number of exemplar circuits to test out your understanding:

Start at 10 MHz and work your way forward to 200 MHz.


This guy hits all the design concern bullets I can think of, and he writes well. Listen to him.

Also, get some PCB quotes before you try to jump feet-first into learning high-speed design via rapid prototyping. The cheapest boards I've ever ordered were 2 for $80; the last board I made with even the slightest eye to density/PDS/SI I think was 4 for $500. That was a dinky 2"x2" or 3"x3" 4-layer board too, none of the crazy 24 or 32 layers that laptops have.


Where are you getting your boards fabbed? I literally just got back 3 2x3" 4-layer boards with 5mil min trace/spacing from OSH Park for a total of $54 (shipping included).


OSH Park does pooled PCB orders; didn't have time to wait for an order to get together. You can definitely save money if you don't mind waiting up to a month for your board. Although, perhaps as pooled orders have grown in popularity, the delay isn't as bad these days?


They do two 4-layer batches a month, so yeah, the turnaround can be nearly a month if your design is ready on, say, the 3rd of the month. I think they do 2-layer board runs every other day now.


This is pretty much what I've gathered from a friend who works with custom wifi stuff[1] (e.g. real-time positioning based on wifi, with a custom receiver chip). They do some mil-spec communication stuff too.

What I really like, though, is the trend that the Raspberry Pi is part of (and other boards/SoCs before it): it would be awesome to get some reasonably cheap, completely open designs, both boards with I/O and RAM, and a few tested FPGAs with something like the LEON[2] "cpu".

I'm not sure how feasible it is -- and I'm sure it would be great for learning and experimentation. Not everything needs to be high speed -- the Amiga could do a lot of cozy graphics and audio, and a snappy GUI, all with a 7.5 MHz CPU. There should be some room to play in the now "ancient"/"slow" space...?

Basically I'd love to see some mini/nano-ITX-like boards with a passively cooled FPGA, with BIOS/init via coreboot, running various open OSs. The Raspberry Pi ticks a lot of these boxes of course, but it still has a few proprietary parts in there (then again, it is dirt cheap too).

The main thing I see missing, is something that is accessible from the ground up (maybe something even slower, along the lines of the C64/VIC20 might be even better).

[1] http://www.radionor.no/ [2] http://en.wikipedia.org/wiki/LEON


Great comment.

As a programmer who's had to go through a lot of the learning processes you describe, I can't agree strongly enough with your informative cautions on the challenges in PCB design. Some here might be interested in this document, which is merely an indicator of some of the challenges you mention.

http://www.bluechiptechnology.co.uk/~bluedownloads/Single_Bo...


Great response! I would like to suggest that as people choose to go down this path (and if it interests you, do it; succeeding at challenging things is rewarding), a great way to contribute to the world of "open *" is to codify your lessons into better amateur tools. One of the problems robomartin lists is the lack of quality in amateur tools, and the obvious solution is to improve them!


The "qualified buyer" requirement is interesting. I understand the intent but wonder how it would work in practice. The linked blog post suggests using a program that needs to be modified and submitted to inquire. A qualifying test like that seems too easy to cheat. I imagine the modified program would quickly become available online for anyone who cares to look.

What tests could you design to verify a hacker's proficiency and acknowledgement of the product's quirkiness, that are practical to verify but can't easily be cheated?

Perhaps when you initiate an inquiry you could be given a unique function that applies a set of randomly generated, non-destructive transformations to a string. Your task is to submit a string that will be transformed by that function into your email address, or failing that, into the output string "I will not ask Bunnie for help, I will fix problems myself and share my fixes".

>but can't easily be cheated

does it really matter? the goal is just to discourage mass market consumers. if somebody wants one so badly they're willing to cheat on a test to get one, that isn't really mass market.

and isn't cheating sort of the spirit of hacking that projects like this want to encourage? if you want something, you make it happen without worrying about the "right way" to do it. nothing wrong with a bit of cheating - this is reality, not school.


Don't underestimate mass market consumers' readiness and ability to use a search engine. The small effort to find the answer online will not deter anyone who comes along wanting a laptop.

What I have seen called cheating by hackers are things like exceedingly clever shortcuts. Cheating by using someone else's answer in a qualification test is not in any sort of spirit of hacking that I'm familiar with.


  And probably, it will be priced in accordance with what 
  you’d expect to pay for a bespoke digital oscilloscope...
Uh oh. That's the $5K range.


A mid range oscilloscope can easily cost $5000. If you look at the whole range of oscilloscopes, $5000 is on the very low end. If you want bespoke, as in custom tailored to your exact specifications, that's easily another order of magnitude in price, possibly two.

Heck, there are oscilloscope probes that cost $5000.

But test equipment isn't exactly high volume stuff.


What call would one have for a bespoke oscilloscope? Surely the whole point of test equipment is that it's standardised?


I'm not sure what you mean here... you want test equipment to be "standard" in the sense that they all agree what 1.0 volt is, but different people can need radically different feature sets from test equipment, as evidenced by the fact that Agilent makes one hundred and fifty three different kinds of oscilloscope, not counting the spectrum analyzers, logic analyzers, signal analyzers, frequency counters...


I can understand wanting more channels or a faster sample rate. And maybe a portable/handheld version. But what other differences are there?


- Ability to do some scripting, or integrate it into a product/project.

- Webserver/standalone operation.

- Integration with other standard comm buses.

- Option for Better A/D (typical o'scope is 8 bits).

- More/better output options. You could potentially use it to simulate some hardware (in the loop).

- Ability to add COTS hardware.

- Open design insures owner against planned obsolescence.


This is kind of like asking why there's more than one model of car on the road. Other than wanting to carry more people or go faster, what differences are there? Any automaker could just produce three vehicles: a sports coupe, a panel truck, and a bus. Maybe a motorcycle (again, just one model).

Depth of the sample buffer. Tradeoffs between sample speed and sample resolution. Triggering. Math (anything from sums/differences to FFTs). Bandwidth, including tricks like equivalent-time sampling. Various kinds of display. Various kinds of human interface. Various kinds of machine interface. Extra features like waveform generators.

Honestly you can figure this out yourself: go to some scope makers' websites (Tektronix, Agilent, B&K, Fluke, Instek, Teledyne Lecroy, etc etc) and see how their marketing department distinguishes their product lines.


Could this be Stallman's new laptop?


Yes, but it shouldn't come with a parrot just because he likes parrots.


No, because it comes with a proprietary GPU.


What an amazing bit of work .. I hope to have one of these things when the time is right. The inclusion of the FPGA onboard makes a lot of sense, and would be a real selling point, in my opinion, for a lot of hackers. I can think of an endless vista of applications for that FPGA .. wow! An Amiga in your laptop! :) Hope I can qualify for one when the time is right!


Finally, a router without a backdoor. Nice!


E.g. this change, made to reduce the possibility of a hostile HDMI device corrupting the firmware: http://www.kosagi.com/w/index.php?title=Novena_EVT_to_DVT_ch...


I'm displeased with the association that open means unapproachable by all but the chosen, but the project is indeed worthwhile and maybe, just maybe, it can be honed to a level where a normie could use it. :)


I don't think people realize that we're standing on the shoulders of giants when we use most of this current technology.

The processor, for example, has a 5,000 page technical reference manual[1]. That's just describing how to use the chip. It doesn't go into detail about how each part works internally. But to the layman it's just a 2cm^2 chip so how complicated can it be?

Everyone wants the sausage but nobody wants to learn (or even know) what goes into it.

[1] http://cache.freescale.com/files/32bit/doc/ref_manual/IMX6DQ...


The limitations bunnie was talking about in terms of sales aren't about technical elitism, but rather a practical decision of not wanting to be pulled into a huge support nightmare when it is almost literally just two guys in a garage working on this as a side project.

This is something he is right to be concerned about. No matter how many warnings you slap on something that it is currently only suitable for advanced users, you will always get a bunch of people who want some shiny new thing and buy it and then bitch and moan that it doesn't have Apple-style end-user polish.

It is all being developed in the open (in all senses of the word), so if some other entity who does want to get into the mass manufacturing and support side thinks doing so is worth it, that can always happen.


Displeasure aside, having access to the source directly benefits people who can read and edit the source.

In the long run this benefits everyone, of course, but it's only the techies who can actually use the source, and that's why it's mostly the techies who care about it being open.


Sounds like a useful board for uses other than a laptop (e.g. robotics, DSP).


This is awesome!! Can't wait to get my hands on one actually.


Sounds like the motherboard I need to finally make that tricorder.





