Oh, Adobe... read the copy, then view the source. (adobe.com)
143 points by josscrowcroft on Aug 18, 2011 | 146 comments



Please, tell me this is Adobe trying to mock IE. It has to be, right? I mean, they've always sort of not valued source code, but this is beyond torturing it.

This is so inspirationally horrible that someone spent hours remaking it the way it should've been: http://studentweb.infotech.monash.edu/~wlay/FIT1012/muse-dem....


Indeed. The inefficiency here is astounding, especially considering the relatively straightforward page layout involved.

Original: 1496 lines, 77.9kB

Your version: 104 lines, 4.75kB

I'd thought we'd progressed beyond the state of a decade ago where Dreamweaver or what-have-you would build you a cumbersome and baroque html splooge to match whatever you had done in the designer, but I guess we haven't advanced that far. Just goes to show you that front-end devs are still as necessary as ever I suppose.


How about we've progressed to the state where HTML is the bytecode you don't want to see anyway, and designers can use modern tools to manipulate it? If the generated markup works, cross-browser and cross-platform (I don't know to what extent it does, but let's assume so), then what's the problem?

For many purposes, optimizing the HTML nerd out of the process is a much bigger win than a 20k download (don't forget gzip) is a loss.

I know this is going to get me downvotes, but I think the dogmatic "HTML shall be written by hand!!1" attitude all over this thread is just people clinging to the past.


In this case it's a 78k download, of just text. That adds up over enough users. It adds to page load times, it adds to page rendering times, it adds to bandwidth costs.

And then there are the higher level issues. What happens when you get a bug report about how the page is rendered in a specific browser/OS? Do you want to wade through 1500 lines of html or 100 lines? Which do you think will be easier and faster to fix? Which do you think will be easier to inspect for correctness from the start? What happens when you need to figure out why your page is rendering too slowly? Which is easier to analyze, which is easier to speed up? What happens when you want to change the design? What happens when you want to take the design and use it as a UI for a web-app?

Using a tool that generates such crappy HTML may allow an inexperienced person to create a web page with a decent appearance, and it may even save an experienced designer a few minutes upfront. But over the lifecycle of a project it ends up being an enormous drain. If you're an enthusiastic teen putting up a web page for your mother's knitting circle, it's fantastic! But this is not in any sense a truly professional tool.


Quite. It all adds up and it's unnecessary. It's not green. It's wasteful. It's not elegant. But I fear it is the way things are going, because I've seen this before -- I used to write games in lovely, pure, beautiful assembly language, then folks started using C etc because it was easier and, hey, hardware like the Amiga could still run the games fast anyway. And then the hardware got to a point where it was all so complicated only a madman would contemplate assembly language...


I think you are not considering the many websites for which this inefficiency doesn't matter. Sure, if you're writing Google's main web page, every extra byte counts. But there are not many web sites where that is even close to being true.

For example, there is no big difference in functionality if the Adobe web site takes a few extra milliseconds to load. Customers just want to buy a product and be on their way. And Adobe is a big company... This is even more true for their customers, thousands of small websites that just want to publish content as quickly as possible.


We aren't talking milliseconds, we are talking seconds to 10s of seconds. At that point, you are risking bounces, which any business-oriented site should fear, even Uncle Bob's Burger Bar.


You're forgetting that when several thousand people access a site per second, even a 1kb difference means thousands of dollars of bandwidth cost and fractions of a second of download speed.... let alone 70KB of difference which would bankrupt your company and turn all of your users away.

We don't write HTML/JS/CSS by hand because it's fun, that's for sure. We focus on code-reuse and delayed loading plus AJAX (plus gzip, compress, CDN, cache, etc.) because the customer focuses on speed and the CEO focuses on the bottom line.


This.

And, not to mention, no business owner wants to get locked into a solution; they want their "data" transferable and standardized to at least some extent.

The code this produces is so horrible that you've lost all the time you spent with it. If Muse dies out, your time spent in it dies with it.


  And, not to mention, no business owner wants to get locked
  into a solution; they want their "data" transferable and
  standardized to at least some extent.
That's a weird thing to say. Why do so many businesses continue to use things like MS Office - or rather, its fabulous file formats - then? Or other vendor-lock-in products? Don't misunderstand me, I wish all businesses thought like this, because open standards are infinitely superior to vendor-specific file formats. But unfortunately, most don't.


Maybe - but nobody uses Front Page, and for good reason. The tools need to reach some critical point of utility and dependability before business will make single-vendor investments.

Still, it absolutely defies reason that a business would need to hire someone specifically to edit a bit of text or change a font on a web site. Once a credible product along these lines hits the market, the terms HTML and CSS will disappear from the overwhelming majority of job descriptions, forever.


That's only weird to those who think open standards are the only viable standard. MS Office is the de facto standard and your data is not vendor-locked when using it. Please provide a link to a somewhat decent Word/Excel/... alternative which doesn't at least let you read the document and save it in its own format.


  That's only weird to those who think open standards are
  the only viable standard.
Open standards are the only viable standard if you want adaptable, future-oriented and collaborative software ecosystems, and markets to match. You simply cannot guarantee or even create these circumstances with standards that are set by a single corporation (or worse, a trust) - which is only logical, because they were designed to do the exact opposite ('defective by design').

  MS Office is the de facto standard
I wasn't arguing that. My point is that this is bad and needs to be replaced.

  your data is not vendor-locked when using it
I think you somewhat misunderstand the term 'vendor lock-in'. Sure, you can open Office files with other programs and convert them into open file formats, such as odt.

This is, however, mostly thanks to people reverse engineering Microsoft's original binary file formats, and MS was not really happy about this to begin with. If they could have prevented it, they would have done so (and they tried). Even the newer OOXML is not entirely documented and prevents free implementations due to patents (which, no matter what Microsoft may claim, is the exact opposite of an open standard).

Also, while this conversion might work fine for simple, small documents (or other files), the more complex and larger your files become, the harder it becomes to convert them without a major hassle, which brings us back to your misunderstanding of 'vendor lock-in'. The term doesn't necessarily mean that it's impossible to switch to alternatives; it also applies when measures are taken to make it as hard as possible to switch without investing heavily in time and money.

As a side note, I am not attacking MS Office specifically. It's just the best example for showing all that is wrong with closed standards and proprietary file formats.


I'm not preaching morals, I'm talking business. Businesses care not about fair software but about their investment, which is why I used the "somewhat open standard" wording.

In the history you're clinging to, you completely ignore that there was no real alternative. Open standards weren't in a viable state. Furthermore, the competition from closed standards has forced open standards to shape up.


Completely off-topic, but - like it or not - business runs on Excel. I certainly don't like it, because I have to deal with it on a day-to-day basis. There's simply no convincing them to move to another format. Excel is the de facto "move numbers and data around" file format in the organization I work for.


If I had to bet, I'd say most companies using a product like this are not looking to build a site that'll be around for years on end. Most of these sites probably have a shelf-life of two years at most.


It's not a matter of clinging to the past.

I definitely want and wish for tools that can allow me to design websites without having to know what IE developers were thinking the day they decided to have some fun.

The reason vim, emacs and the like are still used a lot isn't nostalgia but that they work great for the user.

The issue with all the WYSIWYG editors is that they take the "a web site can be anything and everyone can be a designer" approach, which is completely make-believe. Till they make some form of WYSIWYM editor aimed at the web designer, all other approaches are worthless because they are basically just Word fancied up.

Most graphic design classes have a field trip to see how typographers worked for newspapers 60 years ago, and learn why their job was important - and such a day will come for designers too when it comes to writing code by hand.

But till then, design and code still matters.


Mobile users account for more than 15% of traffic to my company's corporate website (for reference, IE is 26%, Firefox 35%). Making a webpage 10 times bigger and 40 times slower to render isn't really an option.


Downvotes are deserved. You may not want to see HTML, but the thing is, it impacts the performance of the page. Client-side performance has a direct and measurable business impact. Also, let's not forget that more and more people browse the web on their mobile phones. Some still pay a lot for bandwidth or have a traffic cap, so every byte counts there. Also, so far mobile phones are pretty restricted in resources, and if you go over some limit, components of your page won't be cached and the browser will have to redownload them - then see the first point. If you use any kind of DOM manipulation, having a gazillion unneeded elements will slow your scripts down a lot. Once again, on phones and tablets that is even more important than on desktops. Others already mentioned accessibility. There should never be excuses for sloppy code, not to mention horrible code like this.


First of all, it takes almost no time at all to learn HTML/CSS. I'm willing to bet that it takes just about as long as it takes to learn how to use all the controls in Muse. Supporting IE6 is the most time-consuming aspect of cross-platform web support, and there is plenty of precedent for that available freely online. ASM is not so easily learned and understood as HTML/CSS.

Second, HTML is for telling the browser how to lay out content. When you use Muse or Dreamweaver or iWeb or whatever, you're essentially "scripting" your HTML in a proprietary GUI. When that GUI changes or disappears, how will you maintain that page? Yes, by hand.

All WYSIWYG GUIs should seek to output human-maintainable code, at the very least. It isn't a performance issue.


And how much is it compressed? Let's see...

  ~ $ curl -s 'http://studentweb.infotech.monash.edu/~wlay/FIT1012/muse-demo/' | gzip | wc -c
    1741
  ~ $ curl -s 'http://muse.adobe.com/index.html' | gzip | wc -c
   11521
Hmm, so 11.5kB vs 1.7kB with gzip, which I believe browsers can usually handle. That's a factor of 6.6, incidentally. I don't know much about this, but might it still be ok?

I thought to check this because of ridiculousfish's old article (note the FAQ at the bottom, "Isn't that a humungous flood of markup?"): http://ridiculousfish.com/blog/archives/2009/06/01/roundy/#f...


Keep in mind that those connections are all going to be stuck in TCP slow start (gradually improving) for the duration of their communications, meaning that the smaller version gets loaded in 2*RTT, as it will probably fit, in its entirety, in 2 packets. The larger one is going to take 8 or so packets, meaning a lot more RTTs (probably at least 4, assuming aggressive TCP tuning, and possibly 6). Since RTT can easily be 120ms, these can be substantially larger load times, and that can make a huge impact on user impression:

   Even small changes in response times can have significant effects. Google found that moving from a 10-result page loading in 0.4 seconds to a 30-result page loading in 0.9 seconds decreased traffic and ad revenues by 20% (Linden 2006).
edit: It has been pointed out below that RFC 2581 is going to mitigate this somewhat, and they are absolutely right, although I don't know what the implementation levels of this are in the real world, so my observations above may be obsolete for newer OSs.


Basic RFC 2581 should hardly count as an aggressively tuned stack, and even it'd do the larger file in 4 roundtrips (syn/synack, req/2 packets, -/4 packets more, -/8 packets more). These days even an initial cwnd of 10 might not count as aggressive any more, given it's the default initial cwnd on recent Linux kernels...


That factor is exacerbated on a smartphone. Plus, some of them pay for bandwidth.


As long as tools like Dreamweaver exist, there will be a need for good front-end developers to clean up the mess.


What a shameful waste of a good front-end developer's time.


"What a [lucrative] waste of a good front-end developer's time."


You couldn't pay me enough to _maintain_ that mess. Not even a unit test in sight if something breaks.


Hopefully this issue will get back on the table with increasing mobile internet usage, and people bouncing off because of poor load times. I can't count the number of times I have to hit the back button just because a page takes too long to load.

I really do not want to see that 1MB picture of your dog mascot, but I was actually interested in your 5KB product description... Oh well, if you don't care about your site, you probably don't care about selling your product either.


P.S. The big background image is only about 49kb...


To add insult to injury, the layout is broken in Chromium on Ubuntu, whereas the remake you posted renders as expected.

http://i.imgur.com/otaOc.png


Speaking of which, "Adobe Air is not available for your system." :-P


True, I have not come to expect anything else, so I run a WIN7 virtual box, no biggie.


Dreamweaver required you to know quite a bit about HTML. Muse is like InDesign/Photoshop, which people use to do mockups of designs (not UIs/wireframes) today. It cuts translation to code from the workflow. That's a meaningful shortening of the feedback loop for a designer.

As for the rest of us, we don't have to be concerned till Muse starts learning the quirks and features of CSS faster than we do.


The original does have at least one advantage: preloaded hover images. It's nice for them to improve usability a tiny bit, along with all the bloat. I can't imagine a complete newcomer to HTML and CSS hacking together a page and having it come out nearly as bad. Any designer who can learn design can easily learn to code better than Muse, in my opinion.


Sprite sheets?


I can't tell whether you're suggesting a solution or taking a guess as to what Muse did. I think it's the latter. Sorry if I interpreted your question incorrectly. :)

They're using a hidden div full of <img> elements to load the hover images before they're requested by an actual hover. It's all the way at the bottom of their code: <div class="preload_images"> [Removed for brevity] </div>

Sprite sheets are another option (shifting the background-position on hover), but they're a bit more ungainly. They would save a couple of HTTP requests, but that extent of optimization isn't necessary on most sites. Unless I've already combined all my stylesheets into one file, I certainly wouldn't start combining images.
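Roughly what both look like, with made-up file and class names (only "preload_images" is taken from the actual page):

  <!-- Approach 1: a hidden div of <img> tags; the browser fetches the
       hover states up front so the :hover swap is instant -->
  <div class="preload_images" style="display: none;">
    <img src="nav-home-hover.png" alt="">
    <img src="nav-buy-hover.png" alt="">
  </div>

  <!-- Approach 2: one sprite sheet; hover just shifts the background -->
  <style>
    .nav-home { display: block; width: 120px; height: 40px;
                background: url(nav-sprite.png) 0 0 no-repeat;
                text-indent: -9999px; overflow: hidden; }
    .nav-home:hover { background-position: 0 -40px; }
  </style>
  <a class="nav-home" href="/">Home</a>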

What really matters is perceptible lag to the user, and either technique works just as well for that.

I find Adobe's technique kind of neat, and I'll probably use it in some of my future websites.


The windowing technique isn't ungainly when it's done automatically for you a la SpriteMapper :)

http://yostudios.github.com/Spritemapper/


Or you could inject them into the DOM so they load but don't appear anywhere on the page.
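Something like this, roughly (file names invented):

  <script>
    // Image objects that are never appended to the document;
    // the browser still fetches and caches them for later hovers.
    var hovers = ['nav-home-hover.png', 'nav-buy-hover.png'];
    for (var i = 0; i < hovers.length; i++) {
      new Image().src = hovers[i];
    }
  </script>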


Thanks for that. I know a little about ancient-style table-layout web dev, but didn't know what the "proper way" of laying out that site would have been.


Seriously, you have to learn how to use css sprites...


I recommend reading this post by Zeldman from last year: http://www.zeldman.com/2010/07/05/an-indesign-for-html-and-c...

    Says Nack:

    As I noted the other day, “Almost no one would look
    inside, say, an EPS file and harrumph, ‘Well, that’s not
    how I’d write PostScript’–but they absolutely do that    
    with HTML.”


There's a good reason for that - clean HTML is not just the territory of pedantic/obsessive designers, it has tangible and important benefits to many sites (note: many, not all).

Sure, if you want a personal homepage, or put up an ode to your dog Scruffy on the internet, tools like this may very well work for you. But in those use cases, we've already had capable tools for years, many of which produce cleaner code than this.

For anything bigger/more professional, dirty/bloated HTML/CSS means a few things:

- Bigger downloads for your visitors, wasting their bandwidth. But whatever, you're not paying for that, right?

- More data transfer for you and your host. That you do pay for.

- Increased latency (sometimes massively) and decreased accessibility for people with slower connections. Given the awesomeness of American broadband performance, that means most of your customers/visitors. More latency = more bounces = fewer visitors buying stuff from you, reading your ad copy, etc etc.

- Increased rendering times. See: bounce rate.

And these aren't negligible effects. The difference between a 10KB and 100KB file is very significant, and you don't need millions of uniques a month to find out the difference.

Dirty EPS files are not the same thing - because unclean PostScript suffers from none of these deficiencies except bigger downloads - which for EPS files has little to no consequence in the typical use case.


And then, on top of all these fine points, you'll get into trouble when you want to do something outside of what the tool provides. Then you need to touch the code, which is a horrible mess; with code this bad you'll want to kill yourself.


Just like PNG output from Photoshop, soon HTML+CSS output from design applications will beat most humans, or at least be reasonably close so that this complaint doesn't apply any more. It's not very hard to do a lot better than the source code of this web site.

Will Google design their homepage with Muse? No, they want to hand optimize it. Will people design websites for small businesses that get accessed a handful of times per day with something like Muse, when its output gets better? Yes. It might not make nerd-sense, but for a lot of sites it does make business-sense to have an increased page size in return for saving a lot of design and development time. Think of it this way: of all the things a person designing a website for a small company might spend their time on, is reducing page size the most profit increasing activity?


That's a good analogy, because until recently, I was writing all my PNG files by hand.


PNG output from Photoshop is just as bad as Muse's HTML (PNG optimisers cut "Save for web" sizes in half).


PNG optimisers are still automated though. I think parent was referring to hand-tuning the parameters yourself.


For me, this is really about code generators in general. No need to single Adobe out here, as I can't think of a single IDE that doesn't generate all kinds of extra code that, to me, is useless 90% of the time.

You also should keep in mind that Muse is a beta and the Muse site itself was actually written with Muse.

In this case, I think Adobe deserves the benefit of the doubt. They're trying to showcase this product with hopes that people like us will come to the table and give constructive feedback rather than point to the past and say, "See, I told you! Damn Adobe!".


If you don't think this is constructive criticism, don't ever build a product in a space largely influenced by engineers. You need thick skin to be a PM and get feedback from engineers.

Adobe could be learning a ton about their product from this response. If they say to themselves, "Those Developers just want to say, 'See, I told you! damn Adobe!'", they will have wasted a major opportunity to better their product.

Their target market is obviously not developers, and there is a lot of feedback here I would just chuck as interesting but meaningless to the product. There is some, though, that will directly impact whether an engineer laughs when his non-tech friend asks about Muse, or shrugs and says "It's not perfect, but if your needs are light, it'll work" (just like I was telling my father-in-law that Dreamweaver would work for the genealogy-stories CD he wants to make for the family).

Adobe needs to get the product to the second place.


There are worse problems than the technical ones if you write bad HTML. Just two keywords: accessibility (barrier-free access) and search engine optimization.


Also: clean HTML enables hackability.


Good article, except for this minor point:

> Certain kinds of human creativity and expertise cannot be reproduced by machines. […] [machine's] music can never be the Eroica or “This Land is Your Land,” because there is no algorithm with the creative and life experience of Beethoven or Woody Guthrie.

Of course, I agree with the practical point that no current machine can do human art. Because of that we can't currently automatically extract the semantics of an image, or even convert a PostScript document into clean HTML. So it doesn't affect the conclusion in the foreseeable future.

But one can't seriously believe there's no algorithm behind an artist's art without believing in some kind of ghost controlling her brain, and that ghost somehow doesn't run an algorithm. As far as I know, there is no such ghost. It very much looks like our cognition (including our art) is entirely the product of physical processes, even though it definitely doesn't feel like it.

Now, I reckon art is not just the product of some internal algorithm, running in isolation from the rest of the world. We're highly interactive beings, and our output mostly depends on our input. But there is some kind of algorithm which does all these interactions, though it is likely incredibly complex.

My point is, I wouldn't lose hope of automating something that currently only humans can do. Take spam filters, for example. With very little knowledge, they can take out spam with stunning accuracy. But if no one had told me about Bayesian filters, I would likely have tried to make the computer parse the whole e-mail like a human, then given up, thinking that only humans can understand those e-mails well enough to filter them (note my mistaken assumption that the spam filter somehow must acquire some high-level understanding of the e-mail to do its job).


Didn't someone write an algorithm for music generation that produced pieces styled after classical composers that actually did have their style?

I can't find the page right now, but it had samples and they sounded quite good.

Maybe this? http://artsites.ucsc.edu/faculty/cope/experiments.htm


No, clean HTML semantics still matters a lot! Forget about programmatic efficiencies like crawlers and page load times. Think about the poor web dev who is handed this steaming heap of bad markup like so:

Management: "We need you to change all the buttons on this web site and enlarge the logo."

Web dev: "OK, these pages were puked out of Muse. Do you have the Muse project files and a copy of Muse for me to install?"

Management: "Muse? What's that? No, we don't have the source files. Can't you just edit the page?"

Web dev: "Sure. After I drink this bottle of Scotch and look for other job postings."

So guess what? HTML is still source code, not a machine language.


  So guess what? HTML is still source code, not a machine language.

Your metaphor would be great, if it wasn't for all the times developers (myself included) have been asked to dig into assembly or bytecode to fix something when the higher level source file had been lost to the ages.

Compared to those, editing machine generated html is a treat.


Somebody else mentioned the benefits of semantic HTML for saving bandwidth / crawlers.

I'm going to mention something else: fluidity.

It is very easy to patch together a good-looking design in Photoshop. But the Photoshopped content does not move. The window cannot get resized. The divs are not flowing; they are not resized or shrunk as the user moves the window handles.
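A trivial sketch of what "flowing" means in practice - relative units and max-widths instead of fixed pixel boxes (class names are placeholders):

  .content { width: 90%; max-width: 960px; margin: 0 auto; }
  .main    { float: left;  width: 70%; }
  .sidebar { float: right; width: 27%; }       /* small gutter left over */
  img      { max-width: 100%; height: auto; }  /* images shrink with their column */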

Also, while CSS has many flaws, it does provide separation of presentation from content.

Many authors prefer writing entire books in LaTeX. Do you know why? It's because of the What-You-See-Is-What-You-Mean style of writing, which is far better, as from a single LaTeX source you can generate HTML, PDF files and optimized ePubs, for whatever medium you desire, and have it look exactly as you intended.

HTML is also about developing input-interfaces.

I did work with Delphi, and also with the editor in NetBeans called Matisse, which does a kick-ass job at defining Swing interfaces. I also worked with Visual Studio and with Adobe Flex Builder -- nothing comes close to the ease with which I work with HTML.

Heck, I'm working on a desktop client right now that just embeds a WebKit view with native hooks, simply because I don't feel like learning yet another sucky GUI toolkit that will disappoint me in one way or another.

Also, I started my webdev career by using Adobe Dreamweaver. It was one of the dumbest things I ever did.


I find your largely contextless restatement of the quote a bit odd, when the actual post is quoting that from somewhere else, and explaining why this is so (when out of context the quote seems to imply that this shouldn't be so).


It's a teaser, not a TL;DR; read the context.


Perhaps a bigger problem than maintainability using external tools (since one could continue to use Muse to edit the page) is performance. The page is slow to render from the cache in Chrome on my not-slow laptop, uses 153M (compared to 35M for the alternate code elsewhere in the comments) and requires the transfer of quite a bit of data (still significant on mobile connections).


Back a number of years, people were actually very concerned with what was in the EPS or PS files. Preflight before publishing was critical to make sure the files didn't overload the printer. A screwed up print run was pretty costly.

This was with a single vendor spec that was fairly comprehensive. The web is a loose spec with multiple vendors.


This software, Muse, was not used to build that website.

The software used is Business Catalyst (http://businesscatalyst.com/), an enterprise CMS (all-in-one solution urgh) Adobe bought 2-3 years ago. And yes, the semantics are very bad.

In fact, I disliked this piece of software so much, I wrote a post about it a few months ago: http://arnorhs.com/2011/01/19/11-reasons-why-business-cataly...


Hi Arnorhs, I am Alexandru working for Adobe, and I recently took over Business Catalyst.

We did read your review and took it to heart - it wasn't easy, as you were very direct. However, there is a lot of truth in your review and there's a lot for us to improve. And believe me, that's our goal - to improve the product until customers love it and folks like you give us much better reviews :) - because you would actually like the product.

I'd love to chat with you, try to show you where we're going, and get your perspective on what we should do differently. Your feedback would be very important for us.

If you're interested, please contact me at acostin@adobe.com

Sincerely,

Alexandru


I'm sorry if I hurt somebody's feelings (though, that was not my objective).

There were a lot of positive articles on-line when I was evaluating BC. But I later found out that most of them are written by various BC-partners (a good business decision there) - but that also meant that they were a bit biased and skipped over the bad parts.

I'll send you an email. I'm pretty sure Adobe is able to make it better (I'm an avid Adobe fan, mind you).


This. Is one of the things I love about HN. When you can write a critique of a product, and hear from someone directly involved in it, that's a fantastic thing. When they're a fellow-HN reader, even better.


Why not CQ5 or ADEP?


The only thing I see is that the jquery is served from a CDN address that has catalyst in the name. Where else do you see something pointing at Business Catalyst?


Also at the top there's this comment:

  <!-- BC_OBNW -->
also see: http://twitter.com/#!/bc_obnw

And if you view a screenshot of the html of the top of the page and compare that with the BC website:

http://cl.ly/2g2w0I3w0L1m1R1j1Z1O vs http://cl.ly/0m3Y0s0J3r0d3H3h4626

When you think about it, it is very possible that the website's theme/HTML/CSS was built using Muse at one point, but then it might have been transformed into a BC theme/template.


This is correct, the site is hosted on Business Catalyst, but the generated markup is pure Muse.

BC can host any HTML files, we don't enforce a specific markup or template constraints.

Alexandru


That is probably the case. Probably all of Adobe's sites are managed by Business Catalyst.


Scroll down to the bottom of the page, it states: "This site built in Muse (code name) by Adobe®".


Particularly amused by <p class="paragraph">


OOCSS says to never specify elements in the CSS, right? :)


People on HN love ripping on Adobe. I agree that the code is gross, but can we get some constructive criticism? What's a better way to programmatically write good HTML?

I'm willing to bet everybody on this thread writes their HTML by hand. How many of you have a framework for programmatically writing HTML? How many have a framework for writing HTML as broadly as Muse can?


WYSIWYG for HTML is just never a good thing. You spend more time learning the idiosyncrasies of a crippled GUI when you could just get busy in a text file.


I agree! I also think compilers are ridiculous. You spend much more time tuning their output and learning their arcane options and switches and pragmas than you would need just writing ASM by hand.


If modern compilers were as inefficient and horrid as HTML design tools are, a lot of people would code directly in ASM.

Also, HTML is not ASM, it's a very high level language. The ability to code a scant few hundred lines and nevertheless have that become a full-featured, rich, beautiful, and complete layout for a web page or a web app UI is tremendously powerful, and it's no wonder that high-end web devs and designers take advantage of that. Until web design "compilers" approach anywhere within spitting distance (even within a factor of 2) of hand-coded designs in terms of maintainability, cross-browser compatibility, performance, and code size, that's not going to change.


> If modern compilers were as inefficient and horrid as HTML design tools are, a lot of people would code directly in ASM.

Well, compilers may be good for languages such as C++. But for most languages such as Python and Ruby, the code executed is just plain terrible compared to ASM. I don't see anyone writing Ruby code and complaining about that waste, though...


People make the same mistake again and again with code generation, and this goes back as far as CASE tools and Microsoft wizards. The key thing is that you should never use anything where you cannot "debug at the level of the abstraction", as Dave Thomas puts it (Smalltalk Dave, not the pragprog guy). That's why the analogy with a compiler breaks down: you never have to debug the assembly or look at it for any reason, except if you are doing real embedded stuff - and even there, if you end up having to look at or debug the generated assembly frequently, it would be better to work exclusively in assembly. I.e., it's hard enough to map a program into your head; doing it at two levels is impossible without making a mess.

People (including me) like to delude themselves, though. "I'm not a web/database/embedded developer. Here's a tool that will give me a quick win." But in the end, because you find yourself debugging the mess that it generates, you have to rewrite by hand anyway, meaning that in the long run the quick win turns out to take twice as long as just sitting down and learning how to do it. BTW, I think DSLs could be the next thing that leads people down this path. A great idea, but...


Honestly, I don't see how WYSIWYG GUI front-ends for building HTML are at all like compilers for high-level languages. Care to elaborate? I think your analogy will start falling apart pretty quickly.


Your argument is against haml, not a gui.


Oh well I also think vi is ridiculous so there :P


Reductio ad absurdum? :(


Why? WYSIWYG works perfectly well in InDesign (which is absolutely awesome, by the way), why can it never work with HTML? I don’t really understand.

Current and past implementations are or have been bad, sure, but why can’t it work in principle?


It's hard (not impossible) to make it work in principle because a printing press doesn't need to infer meaning from the PostScript or whatever you provide it. A browser does need to, so you'd need to layer the extra semantics on top of your WYSIWYG editor.

Which means asking questions like "this is 36 pixels high: is it a heading, or is this just a landing page with big text?" etc. So maybe I lack imagination, but I don't see it being that easy a problem to solve from a UI point of view.
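To make that concrete, these two can be styled to look identical, but only one of them tells the browser (or a crawler, or a screen reader) anything about structure - a contrived example:

  <!-- semantic: this is the page's main heading -->
  <h1 style="font-size: 36px; margin: 0;">Spring Sale</h1>

  <!-- purely visual: big bold text, no implied structure -->
  <div style="font-size: 36px; font-weight: bold;">Spring Sale</div>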


There are plenty of semantics in InDesign documents. InDesign is all about character, paragraph, list and table styles – it’s what you use to make InDesign work. You wouldn’t have to add much to make those styles do all the semantic work.

Yeah, you would still be able to just ignore all that – just like you can adjust each headline individually in InDesign – but that would only mean that you are an incompetent user of the software. A good WYSIWYG editor wouldn’t always have to produce great markup but a competent user of the software should be able to easily make it produce great markup.

I’m really not that sure why adding a GUI for semantics would be so hard, especially since existing concepts like paragraph or list styles already map very well to HTML concepts.


One big part is that when something is designed for print, it is essentially always designed for a fixed size page, so you don't have to deal with resizing. Then there is the fact that web sites have to cope with people not always having the same fonts you do - which makes it almost impossible to do pixel perfect designs involving text rendering.

Then there is the fact that something like an InDesign document doesn't really have any concept of separating the style from the actual content. For the time being at least, humans are much better at logically ordering and marking up text, and then adding the style with CSS, unlike programs, which add a whole lot of extra markup and CSS to do the same thing.

Those are just a few things which make WYSIWYG work a lot better for print than it ever could for the web - and there are more... These things will probably be somewhat resolved as time goes on (i.e. with wider support for @font-face and so on), but I think that hand-written HTML will be better for a long time to come.


I’m not suggesting to slavishly copy InDesign and I’m not saying it’s easy. InDesign would be absolutely horrible for web design. I’m merely suggesting that WYSIWYG for HTML and CSS is not impossible.

Being able to flexibly test (e.g. different fonts, different resolutions, different browsers) would certainly be one of the requirements, as would be the implicit assumption that pixel perfect designs are not possible.

As for style and content? InDesign does actually separate them to a degree. It’s not perfect but neither is HTML at that task. The concept of character, paragraph, list and table styles is central to InDesign. It’s what makes InDesign so great. It even has a bare bones text input mode where all you do is type completely unstyled text. It’s then easy to apply all the different styles you made to your text.

I think a truly powerful WYSIWYG HTML editor is eminently possible; someone only has to dare to do it.


WYSINWYG: What You See Is Never What You Get.


Yes, I also write all my programs in assembly for the same reason...


Regarding constructive criticism, I posted elsewhere in this thread that Muse generated a div to preload hover images. It's a nice bit of usability optimization, but the code could certainly benefit from a dose of minimalism. Even if Muse initially generates the page with this much bloat, I'm thinking it would be trivial [for Adobe] to run the page through an optimizer that rips out anything without style assigned to it. Then it could refactor the stylesheets, getting rid of redundancies, i.e. styles targeting p.paragraph and the like, and run a second round of optimization. Maybe I'm thinking about it wrong, but it would be worth the effort if they could cut the page to even a tenth of the original size - and even that is probably underestimating what's possible.
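For example, a post-pass could at least collapse things like this (selectors paraphrased, not Muse's actual rules):

  /* before: the generator stamps class="paragraph" on every <p>,
     then styles it through the class */
  p.paragraph { margin: 0 0 12px; color: #3a3a3a; }

  /* after: the class carries no information, so drop it from the
     markup and target the element directly */
  p { margin: 0 0 12px; color: #3a3a3a; }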


But first, tell me why designers need an HTML page generator. Is learning HTML and CSS that hard in the Google era?


Do you really think everyone has 100+ hours to spend learning the idiosyncrasies of HTML and CSS? That's asking a lot for someone that just wants to put up a landing page for an idea, or to design their portfolio.

Every time my artist friend wants to put their portfolio online, I suggest they start by writing some simple markup, and half an hour later I'm explaining the finer points of the box model while they whinge about how they'd rather be using Adobe Illustrator. The truth is that there will always be a group of people who are able to visualize what they want to create better than they can express it verbally or symbolically.


That's why we have WordPress, Tumblr, etc., even things like launchrock, about.me, shopify, the list goes on.


You mean things that are (eventually) html page generators?


For a layman wanting to create a website with static pages it is OK. But I am talking about professional web designers who create templates for dynamic web pages. Why the hell would they prefer a generator like this? In the long term it will turn into maintenance hell.


I have some scripts I use to generate my HTML. I don't have a framework for generating HTML in as broad a scope as Muse because that is a bad idea. Beyond a certain level of complexity, you actually need to stop and think about things.

The right way to do a GUI HTML editor is to provide some templates and some customizations.


Well, at least they're using it. It hurts adoption if people perceive Microsoft to be tepid about WPF or Google with regards to Go. But Adobe is clearly dogfooding.


Adobe will get my constructive criticism when they build something worthy of it, and not a second sooner.

Until then, deal with it.


Since everyone is saying look at the source, might I suggest taking a look at the CSS that is produced for that specific page ... yes, I said page. Each page has its own CSS file that is REALLY big.


The sense I am getting is that GUI generation of markup code is as inevitable as was the first compiler back in the old days. Whether that point is now or not remains to be seen. Back then we had a combination of increase in RAM sizes/processor speed and democratization of Computers beyond advanced research laboratories, that prompted this change. It no longer was necessary to worry about that extra 10 KB or so of cruft that the compiler added. I see similar concerns about file sizes here, whether the point of not caring for bandwidth is here remains to be seen. And could this be the second wave of web democratization? I'd love to see my mum design her website with this.


Are you too young to remember FrontPage?

We've already trodden this path and it led us to dark places.


Like I said, the right time and the right product will be the beginning of a new era. It definitely wasn't the time then, and it may not be now, but isn't it kind of inevitable that we would someday move to a layer of abstraction above the HTML code?


That was worse than I prepared myself for!

Almost the entire page is duplicated within a `<!--[if lt IE 9]>` block...
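I haven't traced exactly which variant they use, but the usual way a page ships old IE its own copy looks roughly like this (contents abbreviated, class name invented):

  <!-- old IE (< 9) renders this copy; everyone else sees only a comment -->
  <!--[if lt IE 9]>
    <div class="page"> ... fallback version of the whole layout ... </div>
  <![endif]-->

  <!-- everyone else renders this copy; old IE skips it -->
  <!--[if gte IE 9]><!-->
    <div class="page"> ... the same layout all over again ... </div>
  <!--<![endif]-->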


I was shocked too. I couldn't believe the commented section was that long.


I'm definitely playing the devil's advocate here, but didn't assembly programmers have the same kind of reaction regarding compiler generated assembly not so long ago?


Assembly is low level; practically machine code. It is hard for humans to read, and writing a program concept in assembly must go through a lot of layers.

HTML is high-level and designed for humans to read and write. It maps directly to the concepts it conveys.


I registered just so I could say this. People keep comparing HTML to Assembly. Come on. HTML is probably the highest level you can get when talking to a computer, at least for the foreseeable future.

I think the more appropriate comparison is to those software generators, or whatever they are called, in which you drag and drop buttons and text fields and have arrows (or something) to convey action.

You don't see a lot of good software made with those, now, do you?

HTML is not Assembly. It's not even C. It's Python, or Ruby.


At least the compilers produce valid assembler code most of the time.


As Deestan said, there's a massive difference between how low-level assembly code is and how high-level HTML is. But even more than that is the fact that they're fundamentally different. Decomposing a sequential list of instructions down into machine code is something that is fairly simple to do algorithmically. But HTML is not a programming language - it is declarative and descriptive rather than imperative.

In the end, humans are a lot better at describing things than algorithms are, and I think that's the biggest problem with WYSIWYG HTML editors.


"You can design and publish original HTML pages to the latest web standards without writing code" - WTF

I think compounding all this inefficiency is how they have "<!-- group -->" on EVERY SINGLE DIV.

Someone should use the Tilt extension to visualise this page in 3D (http://hacks.mozilla.org/2011/07/tilt-visualize-your-web-pag...) and screen cap it.



Now do it for the remake posted up in this thread.


I tried it, but for some reason the plugin/my browser had a big problem with the remake. There were very few layers, but it was overlaying my calendar onto the image or something strange. I restarted the browser and tried again but it kept messing up.


Bridge to Engine Room: We need more DIVs


My...only....regret...is that....I have......div-itis...


Cap'n, Engine Room here - We've diverted emergency DIVs to sensors, increasing noob sensing range. We'll need more add-ons if yer want mowr DIVs!


I read this in my "Enterprise problem of the week" voice.


This is a great example of following the letter of a standard, but not the spirit.


Has anyone seen the video? They're trying desperately to map print paradigms to the web. It's a major fallacy which many developers (and product managers) fall into.


So it will sell well to companies with a lot of money, then? Sounds like a good product.


It will sell well to companies with a lot of money by telling them what they want to hear, rather than what they need to hear. It depends on your definition of a 'good product.'


I am not surprised. What do you think WYSIWYG is supposed to mean?


Ideally, you'd have "What I See Is What You Get." Then traditional designers (the "I") can have their pixel-perfect whatevers reproduce exactly on every single system/configuration (the "You").

Or, you could have "What You See Is What You Want, Based On What I Think Is A Good Way To Do This (But You Can Also Choose)"

Things like resolution independence, mobile/small display support, gestural interaction support, impaired-senses accessibility, graceful degradation, and multi-browser compatibility all go into that, IMO. It may not be identical on every system, but it will look good and be usable.

I can't really see WYSIWYW,BOWITIAGWTDT(BYCAC) being the next sales paradigm shifting quantum leap of buzzwordology though.


Muse (code name): Design and publish unreadable HTML, because you don't write code.


On a casual inspection this is just bloated ineptness, exactly what I have come to expect from Adobe.

One passage consisted of a string of empty divs of the form <div class="wrap"></div> which a decent 'code generator' would simply have omitted. In fact the bulk of the bloat appears to be divs which add nothing to the page.
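i.e. runs of markup shaped like this (not a verbatim excerpt), contributing nothing to content or layout:

  <!-- the same empty wrapper, repeated back to back -->
  <div class="wrap"></div>
  <div class="wrap"></div>
  <div class="wrap"></div>
  <!-- a sensible generator would emit nothing here at all -->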

I strongly suspect one previous version of this code generated tables, then they got the "tables used for layout are bad" memo and did a simplistic translation to nested divs.

Perhaps, to be charitable, they have debugging turned on.


Q. How many of your visitors look at your pages source code?

Q. How many times do you inspect the compiled assembly to see what it generated?

One day these tools will be perfect, it's just a shame for me that Adobe have copied what I've done for the second time: http://blog.gameweaver.com/2011/08/16/has-adobe-copied-anoth...


I wouldn't recommend Muse purely because this is how that page renders for me:

http://i.imgur.com/c7tBl.png http://i.imgur.com/6DUW0.png http://i.imgur.com/sfGmv.png


Which browser / OS?


This is a great product, insofar as 10-year-olds will steal it from BitTorrent and use it to get their first taste of web design, then stop using it once they get serious, then reminisce about how terrible their first projects were. Just like lots of us did with FrontPage.


580 . . . Need more . . .

<div id="yo-dawg"> <div id="i-put-div-in-your-div"> <div id="so-you-can-div-while-you-div">This site built in Muse (code name) by Adobe®</div> </div> </div>


There is a company that makes a Photoshop plugin that converts to HTML/CSS/images too.

http://www.medialab.com/

I don't know what the output is like, though.


Oh heavens! An ugly html file? We can't have one of those dirtying up our monstrosity that poorly attempts to turn a document into an application.


I agree that coding your design into HTML/CSS is a waste of time generally. I've had good luck with PSD-to-HTML services.

So instead of Adobe automagically converting my design to HTML/CSS/images, you get humans from India (or wherever) doing it by hand.

For $100 - $200 you can get clean, cross-browser, easily tweakable HTML/CSS and images. And you get it in 24 hrs.

I've used www.psd2html.com a few times and the quality was quite high.


I have seen far, far more bloated code generated by Drupal (esp. using the Panels module).


Characters: 32,767

Characters after stripping HTML, javascript and CSS: 1,464


div soup is better than table soup!


Makes ColdFusion look efficient by comparison


So they use tables. Big deal.

I don't get that CSS snobbery. Tables do work on even older browsers. That's something you want for a tool that generates code for people who don't want/can't put up with writing HTML themselves. For them the product has to work and look good in most of the browsers.


Did you actually look at the code?


It's not CSS snobbery. It's HTML snobbery, which to some people still matters, for the usual accessibility reasons as well as wanting to write the correct code for the job.


Sure, but if you're hired to write HTML code then I guess you won't be using tools like this. You know how to work around browser quirks. But Joe, who just wants a website made, probably doesn't - and he doesn't care.


Hilarious how many people on Hacker News care what the HTML looks like.

You really are idiots if you're worried about 'that much text' going over the internet. You have no concept of data.


Some people here actually run popular web sites. When you're serving millions of requests a day, a few KB of difference in page size can make quite a bit of difference on how much you pay for bandwidth...

Not to mention that when a browser doesn't render the page correctly, some people don't like wading through thousands of lines of div-soup when they could be debugging one hundred and fifty lines of markup instead...


That is one of the most horrible things I've ever seen. Adobe, please stop having such bad taste. Whenever there's a right way to do things, you seem to commit to the OTHER way.


Wow, looks like you really _can_ create striking sites with Muse: http://maxart.s3.amazonaws.com/muse/index.html



