Do you recommend any particular alternative to New Relic? In particular, I'd be interested in something that helps answer the questions "How much memory am I using, and what's using it?" New Relic helps a lot with the latter question, as it can aggregate request time by Rails action. E.g. it can say "30% of your Apache workers' time is spent in this one Rails action." Which helps you understand what's driving the need for more Apache workers, and thus the need for more memory.
We have pretty cool technology to correlate events into bigger buckets (problems). It is based on tracking each individual transaction's flow through the stack, plus discovery of infrastructure topology and performance.
I think most people would be OK with an additional flat fee so long as the notice is clear and conspicuous. For example, if I were renting a space daily and had to tack on a flat cleaning fee, I might write something like this:
$99 per day, plus $20 flat fee
In other words, you can develop whatever fee structure you want. You just have to adequately explain it to customers so they can make an informed decision.
Shouldn't the price be $119 in that case? I get bothered when someone thinks I'm dumb enough to not notice their little price increase in the form of a fee. I am not OK with that.
I would prefer to deal with honest folks that just give you the price straight away.
I only interface with assholey businesses that add fees when I have no other choice. I have no allegiance to those types of businesses. I'm thinking Delta, Southwest, every car rental place, Wells Fargo, most cell phone companies, etc.
Shouldn't the cleaning fee only be a one-time extra though? $119 for one night sure, but I'd rather pay $515 than $595 for five days. How would you add the flat fee to the nightly rate without knowing how long the stay will be?
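To make the arithmetic concrete, here's a quick sketch using the $99/day + $20 flat-fee example from above (the numbers are just the ones in this thread):

    nightly_rate = 99   # advertised nightly rate, in dollars
    cleaning_fee = 20   # one-time flat cleaning fee
    nights = 5

    # Fee charged once, as a flat add-on:
    flat_total = nightly_rate * nights + cleaning_fee        # 5 * 99 + 20 = 515

    # Fee silently folded into the nightly rate instead:
    baked_in_total = (nightly_rate + cleaning_fee) * nights  # 5 * 119 = 595

    print(flat_total, baked_in_total)  # 515 595

So disclosing it as a flat fee is actually the cheaper and more honest option for multi-night stays; baking it into the nightly rate overcharges everyone who stays more than one night.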
Hypothetically, what if you actually believed it in every instance? That is, what if you genuinely felt (as I do) that nearly all jobs are hard? Granted, some may be harder than others, but I've never met anyone who worked a job I would describe as "not hard."
If the author had put in an internal check for "do I really believe what I'm about to say" and come back with a "yup" every time before speaking, I'd have no problems at all.
However, as I see it, the pleasant lie baked into the original case is essentially "regardless of what I might genuinely think about what you just said, I'm going to give a response that makes you feel good about yourself, thereby making me more likeable."
> common tooling like benchmarking and profiling tools
And might I add: A debugger! For both shaders and the entire GL state. Yes, 3rd party tools exist. But more often than not, I find they won't even compile or execute on a given platform.
Philosophically or practically: What's the justification for nearly everyone switching to flat design? Is there any articulable reason it's "better" than the rich, three-dimensional style[1] that was previously popular? Or is it just an arbitrary trend?
Some say the change is driven by high-DPI displays. I disagree. I don't see any intrinsic reason that flat designs look better than rich, three-dimensional designs on a high-DPI display. Without a doubt, flat can look nice, but so can things like this:
Another justification I've heard is that it's a reaction to the excesses of the previous trend. People often point to the leather motif in certain Apple applications as an example of such excess. But first of all, those examples are outliers; few designs actually went that far. Second, the existence of a questionable use of a given style is not an effective argument against that style in general. Third, "some things were extremely 3d, so now we'll be extremely flat" seems like contrarianism for contrarianism's sake.
[1] Some call this skeuomorphism. I tend not to, because the term technically means something narrower than what we're talking about: http://en.wikipedia.org/wiki/Skeuomorph
Frankly, I don't care about the look of the icons (these should be theme-able anyway), but I do care a lot about the actual application user interface, and for this, flat design is a step back into the 80's.
Flat design is good as long as no important visual cues are lost about which elements can be interacted with and which are just passively displaying information. I still really have problems in iOS 7 separating passive text elements from interactive text elements (formerly called "buttons"). Examples are the contacts list or the famous shift button of the onscreen keyboard.
Thankfully, in OSX 10.10 buttons are still recognizable as buttons, and I like that less radical flatness much more than iOS 7. In its current state, though, the UI looks a lot like a Gnome skin trying to mimic OSX; I hope that improves before release.
A lot of it is flat, but they use simple bevels for buttons and other clickable elements, and the reuse of that bevel on, for example, the buttons in the file manager window looks tacky because it appears cluttered and overused. Also, the excessive lines (e.g. for the window borders), high-contrast colors, and small color palette make the overall appearance unpleasant to look at. And the icon set suuuuucked.
'3D' UI elements were introduced later in AmigaOS 2.0, and I remember how the user interface style guide made a big fuss about how UI elements that appear raised are supposed to be clickable, while flat or recessed elements should be used for non-interactive or disabled UI elements.
Actually this site is a pretty awesome collection of operating system UIs. BeOS still looks nice, almost like retro-modern pixel-art.
Older UIs like the Amiga OS one suffered from an even more limited palette, and even worse typography.
I used BeOS as my main OS for a while and really liked it - it was very snappy and felt more "homey" (that is, more like classic MacOS) than early OSX versions did. (The browser, NetPositive, was absolutely awful though).
pro-3d / anti-flat: details made possible by volume add to the icon's rich design, making its personality stand out, while flat icons result in a boring, uniform design in comparison.
pro-flat / anti-3d: the apparent uniformity makes even slight nuances readily apparent and efficient to parse, making each icon stand out against other flat icons, while 3d icons are full of complex and distracting artifacts.
Bottom line: don't mix both, as the result is that 3d crushes flat with its rich details while flat makes 3d look needlessly busy and noisy. Consistency is key to reducing cognitive friction.
I think 'flat design' is really just an example of minimalist design (which has been around for ages) applied to graphical elements. With many things, especially apps/websites, minimalist design often ends up being a good experience for users because it's not distracting, but that's not to say that you can't have great UX without a minimalist design.
For me, I like flat design (to an extent) because it gives the product a more uniform look and feel, but sometimes it can be overdone and I find myself wishing there were an illusion of depth to guide me.
> With many things... minimalist design often ends up being a good experience for users because it's not distracting...
As long as too many cues aren't discarded (iOS 7 went a bit overboard in this regard) I completely agree; shiny or complicated visual elements with "pop" demo well at first but become tiresome.
It's the same reason our keyboards don't make musical tones like they did in Star Trek -- it would quickly become distracting and annoying.
Since the Aqua look came out in OS X, there have been a lot of imitations used on webpages, knock-off UIs, etc. which tended to look terrible, in part because it's hard to do them tastefully. At least flat UIs are harder to mess up.
It's a fad. Apple came out with it 'first', making it automatically used among millions of people, then designers followed with other software packages and apps.
Yep, but that's how it works. A year ago, people were complaining that Apple sucks because iOS was still using skeuomorphs instead of flat design like Android and Windows 8. Now they've switched their style, and they suck because they invented flat design and everyone is just copying them.
Gradients and juicy-fruit graphics are like candy -- they provide an instant and strong positive feeling, but they don't age well. If you had to eat nothing but candy, you'd hate the sugar-coated morass your mouth had become. In general, a flat icon tends to age better and reduce "eye exhaustion" (not a physiological thing, but a psychological one). Note that this applies mostly just to icons; the entire UI is not so tightly bound by these rules.
However, design is also trend-oriented, and flat is a rising trend.
> I can't think of any better way of describing black.
That's probably the most precise and scientific definition of true black, but it's not how most people imagine a black material.
Any black material you'll encounter in everyday life is still quite reflective in comparison to this high-tech, superblack material. If you see a man in a black suit, you can still see the buttonholes, the lapels, the wrinkles, and the three-dimensionality of the man's body. That's because it's actually reflecting a lot of light.
But if the suit were sufficiently light-absorbent, you wouldn't see any of that. It would look like a homogenous blob of solid color--just a silhouette. One can simulate that experience to a certain degree with photography:
In person, one can also get a sense of that by looking at an extremely high-contrast scene, e.g. a person in a black outfit with strong backlighting. But such a picture feels much more natural to us than would a suit of true black in a room with normal lighting.
> Silencing the ongoing debate, even if one side or the other is obviously delusional, is completely contrary to the spirit of science.
A news outfit does not have the power to silence scientific debate. Real, productive scientific debate does not occur in 5-minute talk show segments. That's just entertainment. Scientific debate occurs in less glamorous venues, such as conferences, journals, and labs. And it involves a great deal more time and technical detail than what you get on a news show. News organizations don't control those venues.
Now, one can of course raise valid concerns about process in the scientific community. Much has been written along those lines as of late. But that's entirely separate from the BBC's policy, and out of the scope of this discussion.
News influences politics. Politics influences public policy. Public policy influences funding. Funding influences research. Research influences news.
A news outlet does have the power to silence debate. Ask any "true Scotsman" libertarian (viz. not Paulians or Tea Partiers) during any election season. Ask Occupy Whatever about why no one actually addressed their grievances. Ask any number of people with legitimate but unheard issues why 24-hour news channels will choose to report deeply on shallow topics rather than broadly on a wider variety. Rupert Murdoch and Sumner Redstone haven't accumulated vast media empires because they want your voice to be heard by more people.
The actual scientific debate is not going to appear on air. The news segments would simply get people interested in learning more and contributing in some way. If you shut the consensus contrarians down now, you may be stifling future climate efforts, because the research industry has to continually struggle for funding for anything that does not have a direct and obvious military application. Any mention of science on radio or on television or in popular fiction is an advertisement for more science. The crackpots and loons give the scientific establishment an acceptable target to talk down to that will not offend the general public. That informs the public while recruiting them emotionally.
It is a rhetorical tactic, essentially making a strawman of an actual person, and not a true argument, but I'm not certain rational, scientific personalities realize that the majority of research is ultimately funded by folks who couldn't tell the difference between a beaker and an Erlenmeyer flask even if you made them watch Breaking Bad from start to finish. Someone has to be the pitchman for legit, non-politicized research. Bill Nye and Neil deGrasse Tyson and Michio Kaku and Stephen Hawking can't be the only science celebrities.
You really do want as many actual researchers in front of the cameras as you can get, so you can find out who can both defend a thesis before their peers and effectively convey understanding to laymen. You want people to be able to name more professional theorists and researchers than Kardashians. It doesn't advance the body of knowledge, but it helps pay for salaries and equipment, and helps fill the education pipeline to keep retirement and emigration from endangering ongoing work.
While Python has a healthy web dev ecosystem, Ruby's feels much larger to me. That's almost certainly because Rails is so wildly popular. And Rails is an excellent, mature framework. So for web dev, I would consider Ruby the winner.
Python is the clear winner for scientific computing. That's not really due to anything inherent in the language. It's an ecosystem thing. If you were using Fortran before, you might be working in a problem domain where Python dominates.
Both are excellent for miscellaneous scripting work. E.g. reading in a CSV file and doing something with each row; batch-processing a bunch of images; renaming 1000 files according to some ruleset; gathering some system data and sending nightly status emails.
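For instance, the CSV case is only a few lines in either language; here's a minimal Python sketch (the file name and column name are made up purely for illustration):

    import csv

    # Read a CSV and do something with each row.
    # "orders.csv" and the "amount" column are hypothetical examples.
    total = 0.0
    with open("orders.csv", newline="") as f:
        for row in csv.DictReader(f):
            total += float(row["amount"])

    print(f"Total amount: {total:.2f}")

The Ruby equivalent with its standard csv library is about the same length, which is exactly the point: for this class of task the two are interchangeable.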
In terms of syntax and features, they're very very similar. Python has meaningful whitespace, which you may like or dislike. (I think it's good for enforcing proper formatting, but you're free to disagree.) Ruby has multiple ways of expressing the mathematical concept of a function (methods, blocks, and procs), which has its pros and cons. Both have metaprogramming facilities, though I find Ruby's more pleasant. If I remember correctly, it was in large part the metaprogramming that made DHH pick Ruby for Rails.
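To give a tiny taste of what that looks like on the Python side (a toy sketch, not anything DHH or Rails actually does; all names here are invented):

    # Toy metaprogramming sketch: synthesize find_by_<field> lookups at
    # attribute-access time, loosely in the spirit of ORM-style dynamic finders.
    class Table:
        def __init__(self, rows):
            self.rows = rows  # list of dicts

        def __getattr__(self, name):
            # Only reached when normal attribute lookup fails.
            if name.startswith("find_by_"):
                field = name[len("find_by_"):]
                return lambda value: [r for r in self.rows if r.get(field) == value]
            raise AttributeError(name)

    people = Table([{"name": "Ada"}, {"name": "Alan"}])
    print(people.find_by_name("Ada"))  # [{'name': 'Ada'}]

Ruby gets the same effect with method_missing or define_method, and many people (myself included) find that version reads more naturally.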
I kind of see what you mean :) At my university they announced a Python course for developing Computational Fluid Dynamics programs (aerospace engineering), so my first impression of Python was not so much as a web development language but as a tool for engineering/scientific programming, the way Fortran is used, although it's also a popular choice for web apps. After having worked with Fortran, I found Python more attractive because of the indentation and syntax style. I also read that Paul Graham likes Python a lot more haha
But a lot of other reading, plus the fact that Twitter and Groupon were coded in Ruby, made the choice! I've read that Twitter is now moving to Scala, but since there are a lot more resources out there for learning Ruby/Python, I kicked Scala off the list.
I like Ruby. Sometimes it seems a bit confusing when there are many ways to express the same thing, but its natural-language-like approach is helpful.
Thank you all for your quick responses! It's a pleasure to join the HN community!!
> While Python has a healthy web dev ecosystem, Ruby's feels much larger to me.
It's funny because I actually feel the opposite: Ruby is mostly a niche language for the web community, whereas Python is now mainstream in so many different areas (graphics, scientific, sysadmin, financial...).
I've yet to see a non-webdev tool embedding Ruby as a scripting language, while Python is now featured in office suites, spreadsheets and enterprise software. Anywhere simple and clear syntax is more valued than metaprogramming features, Python is likely to appear at some point.
I think we have almost the same opinion, actually. I said Ruby's "web dev ecosystem...feels much larger to me." I agree that Ruby's strongest niche is web dev. And I think it has an edge over Python there. Outside web dev, I don't see Ruby dominating any particular niche.
It's not a Rails vs Django feature comparison that gives Rails the edge. Convergent evolution means they're pretty much on par with each other all the time.
The difference is in the size of the ecosystem as a whole. The Ruby web dev world appears to have more (or more visible) participants. Which affects things like Stack Overflow, web-specific packages on Github, blog posts, etc.
It's not an order-of-magnitude difference, as far as I can tell. But still significant.
It was particularly those web-specific packages on GitHub that I was curious about. For example, is there a package for doing CAPTCHAs for Rails that's better than all of the ones for Django? Better admin apps? Better apps to help you integrate with Bootstrap more easily? Or create REST interfaces?
These things would make a huge difference to productivity if one platform had more (and better) of these things, but I haven't noticed that to be the case.
For any given fringe view that the media properly marginalizes, some people will go on believing it. But that doesn't mean the number of believers (or undecided people) will stay the same. It's quite possible, and in my view probable, that reducing airtime from fringe views will substantially reduce the number of people who hold or lean towards those views.
You're at least partly right, but I would argue the conspiracy types are already well acquainted with "alternative" news sources (infowars, Coast to Coast, random bloggers...).
Additionally (and this is just a personal feeling), something about top-down "reducing airtime from fringe views" feels very wrong. Who decides what a "fringe view" is? By that rubric, Creationism is perfectly legitimate since it's widely held.
I think we'd all be better served, morally as well as practically, by letting the loons be loons and letting people make their own decisions rather than trying to explicitly control who gets airtime. Bring on the deniers and let them get annihilated in a debate against someone who knows what they're talking about. This has the benefit of not invoking censorship boogeymen (which may or may not actually exist) and doesn't hand over the "we're being oppressed" card to them.
Climate change scepticism is not a fringe view, but moderately widely held. It is hard to believe that those people will suddenly support environmentally literate policy. They will just switch to a different set of trite anecdotes to prove why nothing should change.