DeepMind in “very early stage” talks with National Grid to reduce UK energy use (arstechnica.co.uk)
254 points by rbanffy on March 20, 2017 | 110 comments



The DeepMind datacenter project was very interesting, but a lot of the ML people I spoke to were quite dubious about how much of it was genuinely down to new AI/neural networks, and how much of it was Google PR to justify how much they spent on DeepMind.

> DeepMind trained a neural network to more accurately predict future cooling requirements, in turn reducing the power usage of the cooling system by 40 percent.

But when you look at the DeepMind blog post (https://deepmind.com/blog/deepmind-ai-reduces-google-data-ce...), it looks like the 40% model is being compared to a baseline of doing nothing. So the question is: is this really something you need an AI research powerhouse like DeepMind for, or is it something a regular data science team could do?


I think this question applies to all applications of machine learning. PR pieces love to label every application of ML to a novel field as "artificial intelligence," even when the predictions are simple enough to come from statistical regression (and in many cases that's effectively what is happening).

So yes, many applications of ML are something "a regular data science team could do." However, isn't the main benefit of ML removing the need for a team of data scientists? Of course you will need some data scientists / ML engineers to support the ML product itself, but ultimately the gains will be realized from the ML doing the work that traditionally required a team of data scientists.
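
To make the distinction concrete, here's a minimal sketch of the kind of baseline a regular data science team might start from: an ordinary least-squares regression predicting cooling power from weather and server load. All the features and numbers below are invented for illustration; this is not DeepMind's model.

    # Illustrative baseline only, not DeepMind's method: predict cooling
    # power from outside temperature, humidity, and server (IT) load.
    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(0)
    n = 1000
    X = np.column_stack([
        rng.uniform(5, 35, n),     # hypothetical outside temperature (C)
        rng.uniform(20, 90, n),    # hypothetical relative humidity (%)
        rng.uniform(0.4, 1.0, n),  # hypothetical IT load (fraction of peak)
    ])
    # Synthetic target standing in for measured cooling power (kW).
    y = 50 + 8 * X[:, 0] + 0.5 * X[:, 1] + 120 * X[:, 2] + rng.normal(0, 10, n)

    model = LinearRegression().fit(X, y)
    print(model.score(X, y))  # R^2 of the fit on the training data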


I don't think your second point follows from your first. ML is not easy to set up or maintain, even if we're just talking about linear regression. Companies trying to make 'data scientists obsolete' usually frame ML as a black box: you plug data in one end and it spits out profits at the other. But in reality it doesn't work like that; significant effort has to go into building and maintaining data pipelines, making sure data quality is good, and monitoring that the ML model is actually doing anything useful in the end.


It can be easier to set up and maintain than hand-rolled solutions. The best example I can think of is speech recognition - previous systems used hidden Markov models, Gaussian mixtures, triphones, and all sorts of complex and obscure things that had to be tuned by experts. Now they just use an end-to-end neural network that goes directly from sound to letters (almost, anyway; I believe the input is still MFCCs).

I agree it's not magic, but then neither are 'manual' methods, and at least the maths is a lot simpler.
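
For a concrete picture of the pipeline described above, here's a rough sketch of an MFCC front end feeding a small recurrent network (librosa + PyTorch). The synthetic audio, layer sizes, and 29-symbol alphabet are illustrative assumptions, not a real ASR system.

    # Sketch only: MFCC features into a recurrent acoustic model that
    # emits per-frame letter scores (e.g. for CTC training).
    import numpy as np
    import librosa
    import torch
    import torch.nn as nn

    sr = 16000
    audio = np.random.randn(sr).astype(np.float32)  # 1s of noise as a stand-in
    mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=13)  # shape (13, frames)

    frames = torch.from_numpy(mfcc.T).float().unsqueeze(0)  # (1, frames, 13)
    rnn = nn.GRU(input_size=13, hidden_size=64, batch_first=True)
    to_letters = nn.Linear(64, 29)  # 26 letters + space + apostrophe + blank

    hidden, _ = rnn(frames)
    letter_logits = to_letters(hidden)
    print(letter_logits.shape)  # torch.Size([1, frames, 29])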


> Now they just use an end-to-end neural network

You've missed my point. The thing is that building the ML model is NOT the hard part of machine learning in industry. The hard part is building the infrastructure that makes that machine learning model do something useful. It is much harder than people imagine. See, for example, this great paper by Google for more details: https://static.googleusercontent.com/media/research.google.c...


I don't think either of you are strictly wrong, though--the comment's point seems to be that doing things by hand was so much harder that it couldn't even be done to the same level of success that machine learning achieves.


Unless you know something I don't, the SOTA in ASR is a hybrid system that still has all the complexity of the traditional generative model plus NNs on top.


That's true, but I've noticed quite a few deep learning NLP papers take pains to use competitive baselines (e.g. state-of-the-art non-DL techniques, more complicated than regression), and I've heard that good venues will reject your paper/talk if you use poor baselines.


I was going to say, I feel like this could be done with Excel and some historical data combined with weather forecasting.

That being said, even if ML only gets you a 0.5% advantage over spreadsheet math, that is a non-trivial amount of savings on a national scale.
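
As a back-of-envelope check on that (both inputs below are rough assumptions, not figures from the article):

    # Rough value of a 0.5% saving at national scale. Assumed inputs:
    # ~300 TWh/year UK electricity demand, ~GBP 50/MWh wholesale price.
    uk_annual_twh = 300
    saving_fraction = 0.005
    price_per_mwh = 50  # GBP

    saved_mwh = uk_annual_twh * 1e6 * saving_fraction
    print(saved_mwh / 1e6, "TWh/year")                    # 1.5 TWh/year
    print(saved_mwh * price_per_mwh / 1e6, "M GBP/year")  # ~75M GBP/year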


Yes, and there are two other points to follow up on this one:

If one ML solution is better than another by only 0.01%, in this context that is worth a lot. It is a winner-take-all type of situation.

Edit: the second point was this: if one ML algorithm gives us a 1% saving and another gives us a 0.9% saving, you could call that a 0.1 percentage point improvement, or a 10% relative improvement (10% of the 1% saving). A percentage sign without context is not that meaningful.


On the other hand, if ML gives you a 0.5% disadvantage, that too is a non-trivial amount of losses on a national scale. Because they compared against a baseline rather than against another data science team (or both), it's difficult to know whether ML actually brings an advantage here.


Can't we assume they already have a data science team? Optimizing the grid is their job already right?

From what I've seen in several places, though, companies tend not to listen to their own people but are willing to make changes based on the advice of hired "experts". So the experts will come in and ask everyone what the challenges are and what they might do to improve. Then the "experts" will pick several low-hanging fruit and find a few things that cross internal boundaries (the harder things to change) and make recommendations. Upper management will go with it and improvements will be made. I suppose the way to sum it up is that they bring objectivity to the table.


What I'm wondering is why I can't find anything about ML applied to finding and curating data, which is the most tedious part of data science. That would be an interesting way of using ML without fuzzy stuff.


There are a few current projects, particularly in database research. I don't know how many of them use ML in the traditional understanding. Current projects I know of are Wrangler, Mimir, Katara, MayBMS (in no particular order).


Did you try any of them? Wrangler seems like a tool similar to OpenRefine to me.


Would you pay for it?


But is it non-trivial net of the expense of gaining it?


That is a much easier decision to make, but I have no idea what the costs and savings involved are, so I do not know.


>> Google PR to justify how much they spent on DeepMind

Why would Google try to justify DeepMind and not the tens-to-hundreds of other companies it acquires in a year?


DeepMind's expenditure is probably upwards of $1bn/year.


What costs do they have that would be one billion a year? I can understand servers and salaries for the researchers, but not on the scale of 9 figures a year.


I think you could make a similar argument for a lot of important choices: when you pick a doctor, an employee, a job, a computer, you're going to pick the best you can afford, even if the middle-of-the-road doctor or computer could get the job done. Generally people want the satisfaction of having chosen the best in the business, and of not leaving anything on the table.


From the blog post, it seems that the 40% reduction is compared to the current utilization:

"by applying DeepMind’s machine learning to our own Google data centres, we’ve managed to reduce the amount of energy we use for cooling by up to 40 percent. [...] Given how sophisticated Google’s data centres are already, it’s a phenomenal step forward."


That "up to" renders the rest of the phrase meaningless. Was it 40% less for one second/minute/hour/day?

I assume, since they haven't actually given an impressive number without massive wiggle room, that it's because they're clever enough to know the real number is embarrassingly low - but I would welcome corrections.


+1

I worked at one of the datacenters where they rolled this out. It definitely wasn't all that impressive in terms of real energy savings: it would temporarily make our PUE drop, but we'd have to make up for it after the recommendations were secured, to account for the temporarily relaxed set points. It was mostly a PR thing from my point of view.


Not an expert, but I expect that where AI/machine learning/NNs shine is in the ability to produce very high-quality solutions (or approximations) for large-scale NP-hard problems. With regard to the "NP-hard" part, I don't think regular data science can be a sufficient substitute.


What do you mean by "regular data science"? Deep learning is just another tool in the belt of a data scientist. A deep learning solution doesn't just spring up on its own. At this point in time anyway, all solutions are engineered by engineers and data scientists.


DeepMind will go down as one of the best acquisitions ever


Heh. If only they mandated replacing the majority of the crappy wooden, single-pane windows they have in most houses, which let all of the heat out and the wind and noise in, with normal double-glazed windows and good insulation, they would cut the energy footprint in HALF, I'm sure...


>crappy wooden

I'm not sure where you got the gripe against wood from. It lasts longer than uPVC given maintenance every 5 years, looks better, is more environmentally friendly and has similar thermal properties. Oh and it doesn't go yellow.


Nothing against wood, really. But if you've been to the UK, you should agree that the unmaintained, not-properly-fitted, wooden frames that they use in most old windows are just crap. They let in huge amounts of wind, cold, etc, through the unsealed seams. Anything that improved that would be 10 times better, regardless of the yellowing, etc.


Most wooden sash windows (I assume you're talking about wooden sash) were properly fitted at the time, but are Victorian or Edwardian and therefore over 100 years old. They've done pretty well considering their heritage. New wooden sash with double glazing are better IMO than plastic windows - as the parent noted, plastic doesn't do well long-term when exposed to light.


>New wooden sash with double glazing are better IMO than plastic windows

Hugely - I think it's a testament to the design that they still look great today. Double-glazed wooden sash windows are, however, much pricier than a uPVC alternative, which puts them out of reach of many.


So then, we all agree that the old (ancient) and unmaintained wood frames in the UK are crap. Cool! :)

Really, I couldn't care less if they replaced the frames with wood or whatever, as long as they made sure that they were thermally good and avoid wasting all of the heat...


No.


I had my sash windows refurbished a few years back, adding double glazing and draft proofing. They said the frames were the original wood from around 150 years ago, although they said the glass was from the post-war period (in common with most windows in London, given most were shattered at some point during the war). There's no way uPVC frames will last anything remotely approaching that amount of time.


>you should agree that the unmaintained

Agreed, but wasn't this also an issue with steel windows in the 60s? uPVC put in in the early 90s already looks crap (IMO).



The amount of glue used to produce wooden windows is so huge that PVC windows are a much better choice, both environmentally and for your health. There are different colored tapes that can be applied to make them look like wood, but they are still uglier.


I suppose you are joking, as double-glazing seems to be a popular meme in the UK (with people being convinced they don't need it and scammy vendors pushing it)? Must admit I never completely understood it, coming from a country where double glazing is the norm.


I can't understand your comment. You are agreeing with me that, in most other parts of the world, double glazing (i.e. glass that won't break if you elbow it by mistake!) is the norm. But in the UK, it's sadly the opposite. So we agree that they should replace their crappy windows with double-glazed, properly insulated ones...


I meant I never understood why people in the UK dislike double glazing and think of it as a kind of joke. Don't know much about the story of double glazing in the UK, though.

The only thing that comes to mind is that climate is supposed to be mild in the UK, so maybe double glazing makes less sense than in colder countries? But it can get pretty cold in the UK...

My comment wasn't meant as a criticism of your comment.


Double glazing is popular in the UK - but it was a late adoption: I didn't see it become popular until the early 1990s (I note this would be shortly after the privatization of British Gas, hmmm!)

A contributing factor is the popularity of rented accommodation, including council-owned houses, where the pressure to invest in double-glazing is less (as tenants are responsible for gas/elec bills, but landlords are responsible for capital-expenditures and improvements) - so there's not much incentive to upgrade windows in that case.

But all new houses I've seen built since the late 80s tend to have double-glazed windows; the majority have white PVC framing, but I see wooden ones occasionally too. I've never seen white PVC window framing turn yellow - my parents had their windows converted around 1994, and now, 23 years later, the frames are still pristine white. I'm not aware of any special treatment or care they require.

That said, the public perception of double-glazing salesmen isn't the best - they were the butt of many a joke in the 80s and 90s, including Blur's Parklife video: https://www.youtube.com/watch?v=YSuHrTfcikU - though I never understood it myself; perhaps because they were a commonplace sight?

That... and even today they seem to capitalize on their public perception with intentionally gauche TV ads: https://www.youtube.com/watch?v=OgqJPd_YJtE


Thanks for the clarification!


I think this is itself a "legacy joke"; I would say it was the 80s-90s when door to door double glazing salesmen were rife. The only real objection was the sheer cost of retrofitting it, plus a bit of stiff upper lip about not caring about being cold.

A friend of mine lives in a single-glazed 13-storey 1960s apartment block. It can be quite drafty. There's no way it will ever be refitted and I'd give it a couple of decades until its demolition.

The UK's housing stock is very old on average and some of it is in very poor condition.


>> But in the UK, it's sadly the opposite.

Any data to back this up? Personally I find it a pretty rare sight to see windows that aren't double glazed, and if I came across a property without them, that would be absurd enough that I wouldn't move into it.


Sure, it's anecdata... but just come to London. All of the old housing from the 70s or before (most of Z1-Z2) is single-pane. Also, go search Zoopla for double glazing... landlords mark it as an extra for a reason (unless it's a new development, in which case it will be double glazed).

Well, I've talked to people and been to quite a few places in the UK, Scotland included. I know many people living in London and elsewhere in the UK. We all agree. I'm pretty sure it's the same all over the UK.

And for the rental prices that you pay here, it's pretty shameful the state of the houses, windows included.


Interesting. Maybe it depends on where the property is located within the UK. There was a big double glazing push in Northern Ireland in the 90's and I can't remember the last time I was in a property without it. I've rented in London (Z2) a few times and haven't come across an issue yet but I may have been getting lucky. The different heating requirements between Southern UK and Northern are pretty astounding so that might have something to do with it. I haven't used the heating in my London flat in about 4 weeks now and throughout the winter 4 hours a day was plenty. In NI I would have been using it for another couple of months at least and almost constant during winter.


They kind of did that with the Green Deal (helping people finance home improvements which boosted their energy efficiency) - https://www.gov.uk/green-deal-energy-saving-measures/overvie...


I'd love to see the results... I've never been as baffled in my life as when looking for a "normal" flat in London: horrible prospects that cost a fortune, with windows from a century ago. And when I asked about that, they all said "oh, but the house is listed! Can't change the windows unless X, Y, Z". Whatever.

We can put people in a rocket and take them to the moon, but we can't replace windows and make them look identical to the previous ones with all of the modern efficiency, because, oh well, the building won't look exactly like two centuries ago or something.

Completely crazy, let me tell you... :(


I'd be surprised if it was genuinely because the building's listed, it's usually just that the landlord doesn't want to make the investment (and the market doesn't demand it). Sadly, until the law requires rentals attain a certain level of efficiency, the same way you need something like a gas safety certificate, I can't see it changing.

And apparently the Green Deal was a failure - https://www.theguardian.com/environment/2016/apr/14/green-de....


Someone's taking the piss. There's almost no chance a "normal" flat in London is listed. Conservation area, maybe, but that's a different kettle of fish. Most likely your prospective landlord's a cheapskate.


I don't know about London, but my very normal flat (currently renting below LHA rate) in Cheltenham is grade II listed.

There are about 2,000-3,000 grade 1 and grade 2* listed buildings in London. 92% of all listed buildings are grade 2; the other 8% are grade 1 or 2*. So there are likely to be a fair number of properties that are listed.


I agree most buildings won't be listed and most landlords are cheapskates. But there are quite a lot of listed buildings in London:

http://londonist.com/2011/06/all-listed-buildings-in-london-...


There's a history of people hiring the wrong builders and buildings being destroyed; I should imagine it's partially to hedge against that.


If you take the example of something like a Crittall steel-framed single-glazed window, a classic of the 1930s, there's just no way to fit double-glazing into the frames without destroying the appearance. The problem of thermal bridging (the metal conducts heat straight through) is also difficult to solve without thickening the frames. Maybe future technology will solve these problems. It seems more likely that the appearance of those buildings will take a hit due to the urgency of saving energy.


Some streets, particularly in London, are very picky about modifications to houses. Even if it's not a listed property, you might need to get planning permission to put double glazing in - if it'll make your house look out of place with the neighbourhood the council can disallow it.


No, you can't. I looked at buying a grade 2* listed property (mentioned in Pevsner, no less), an 1800s almshouse.

You even had to use authentic rubbish paint on the doors.


So that's what I'm saying. Instead of useless laws like that one, the Government should put a lot of weight and money behind refurbishing all of those windows, making them modern while using all those "original materials" or whatever they want.

Instead, I'm sure half the energy footprint goes to waste through those windows...


I'm not sure many people would agree with you that laws to prevent the destruction of historic buildings are "useless".


LOL please go look at a map of London, Z1-Z2. Protection? It's almost like every other building is "protected"!

Plus, read my comments. I'm saying the government should take care and sponsor the fixes to ensure the quality while greatly reducing the energy bill (that's my thesis).


London is 2000 years old. There are lots of historic buildings. There are energy saving grants available. Building regulations on insulation (including glazing) are very strict, but they're not retroactive until further work is done on the building. That's when you do see the double glazed units in sash windows. These are, incidentally, very expensive items, but then so is most listed building work. If you don't want it, don't buy a listed building.


That's what I think any time I'm in the UK. But most buildings are either protected (this happened in our office: the windows could not be double glazed due to protection, so they added a new, secondary window "behind" for insulation), or nobody wants to spend that amount of money.


Not wanting to invest in the building is by far the more common reason.

Especially when the tenant will take all the advantage of lower bills, increased comfort etc.


I think 'most' is a huge overstatement; the last house I lived in without it was a student dive in 1996.


Really? Check this: https://www.bre.co.uk/filelibrary/pdf/rpts/countryfactfile20...

Page 3, Summary:

Around 83% of homes in each country have some double glazing. Northern Ireland has a higher percentage of homes which are fully double glazed, 62%, compared with 32-44% in the other countries.

So, ~60% of the houses are NOT fully double-glazed. In 2007. Srsly. And sure, it's from 2007... but I highly doubt the situation has changed...


Why would you doubt the situation has changed? Also, not fully double glazed would cover my current house which has one single glazed tiny bathroom window with an extractor fan in it.


Since quite a few people have been commenting without even googling some data, here we go:

https://www.bre.co.uk/filelibrary/pdf/rpts/countryfactfile20...

Page 3, Summary:

Around 83% of homes in each country have some double glazing. Northern Ireland has a higher percentage of homes which are fully double glazed, 62%, compared with 32-44% in the other countries.

So, ~60% of the houses are NOT fully double-glazed. In 2007. Srsly. And sure, it's from 2007... but in my experience, and in talking with a lot of friends, in a major city such as London and in other small cities (e.g. Fareham in the south), the situation is quite similar nowadays...


On a similar note, for anyone who hasn't seen it, you can get live grid metrics from http://www.gridwatch.templar.co.uk/.


what is this, a website for ants? It needs to be at least twice as big!

Seriously though, that's quite an unreadable interface.


It looks great on a 22 inch screen, and likely even better on a nice 50 inch monitor - but on a normal laptop screen it is indeed nuts.


There's a huge conspiracy theory brewing about the mandatory rollout of smart meters. They're going to absolutely freak out when they learn that Google's DeepMind will be watching them.


I like these guys who are already running a "virtual power station" which can smooth out the peaks by switching off non-vital power loads:

http://www.openenergi.com/

With Machine Learning, natch: http://www.openenergi.com/virtual-power-station-with-big-dat...


This title is missing the following words: "suggests they can", "proposes to", or my personal favourite, "something something AI solves all your problems".

Come on, folks, we can be better than this.


We added "very early stage" to the title since that's what the National Grid is quoted as saying in the (you're right, extremely frothy) article.


I doubt this will use deep learning though. DL works best at detecting patterns in large quantities of data, whereas the time series data here is limited to just a few years... and the system has changed over those years, and continues to change significantly. I'd guess that 10% is the cost of ensuring supply, i.e. spinning up generators as contingency - any attempt to shave savings from that may increase the risk of power outages. Doing so while maintaining the same power outage risk is the goal here, but I think it would be hard to prove that some new clever strategy has the same risk levels. In this respect 10% sounds ambitious to me.


I know a couple people working on using deep learning for financial timeseries. It can be beneficial even with relatively limited data (a couple of years with high granularity).

I also think the 10% figure is quite ambitious; especially given that they are at an early stage of negotiations.


It might be reducing transmission losses by 10%.

I.e. trying to favour supply near users to reduce losses in long transmission cables.

The whole thing seems rather tricky though, because the entire grid right now is run on a market-based approach, where suppliers and users bid for the right to sell/use power every half hour. The ML in that case would have to be given to every company to make smarter bids.
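
For intuition, here's a toy sketch of clearing one half-hourly settlement period by merit order. The bids and demand are invented, and the real balancing mechanism is far more involved than a single uniform-price auction.

    # Toy uniform-price clearing for one half-hour settlement period.
    # All numbers are invented for illustration.
    supply_bids = [   # (price GBP/MWh, quantity MW) offered by generators
        (20, 500),    # e.g. wind, cheap to run
        (40, 800),    # e.g. gas
        (90, 400),    # e.g. peaking plant
    ]
    demand_mw = 1100

    supply_bids.sort()  # cheapest first: the "merit order"
    met, clearing_price = 0, None
    for price, qty in supply_bids:
        met += qty
        clearing_price = price
        if met >= demand_mw:
            break

    print(clearing_price)  # 40: the marginal accepted bid sets the price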


And reducing the use of standby gensets which is very expensive not to mention polluting


Yeah, I'd guess electronic exchange financial time series have a lot of interesting dynamics at very fine detail because of high-frequency trading, so it's a very different system from a power grid in that respect. And for the record, I regard HFT as mostly a wasteful activity that effectively amounts to a tax on using electronic exchanges - OK, there's a fuzzy line between arbitrage (useful) and full-on HFT (which is clearly pointless IMO).


I'm involved in very-unsexy not-Google-level smart grid research using machine learning. A Phasor Measurement Unit (PMU) like those being deployed in America takes sixty measurements a second for each signal, and this is in no way overkill for the granularity of information required.
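
For a feel of the data volume that reporting rate implies, some straightforward arithmetic (the per-device channel count and fleet size below are hypothetical):

    # Data volume implied by 60-samples/second PMU reporting.
    rate_hz = 60
    signals_per_pmu = 6   # e.g. 3 voltage + 3 current phasors (assumed)
    pmus = 100            # hypothetical fleet size

    per_signal_per_day = rate_hz * 60 * 60 * 24
    print(per_signal_per_day)                           # 5,184,000 per signal
    print(per_signal_per_day * signals_per_pmu * pmus)  # ~3.1 billion per day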


Depends; the input data set could be massive, incorporating things like TV listings data (half-time kettle surges are a huge part of our power consumption profile), weather, publicly listed events, perhaps scraped from websites and so quite dirty.

I can see this transitioning quite easily from a 'we can sanitise this data ourselves' job to a 'screw it, let a massive neural net figure this out' one.
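
A hedged sketch of what that messy feature assembly might look like, joining half-hourly demand with weather and scraped event data. Every frame and column name here is invented for illustration; it is not anyone's production pipeline.

    # Illustrative only: merge half-hourly demand with last-known weather
    # and nearby scraped TV events into one modelling frame.
    import pandas as pd

    demand = pd.DataFrame({
        "time": pd.date_range("2017-03-20", periods=48, freq="30min"),
        "demand_mw": 30000.0,  # placeholder national demand
    })
    weather = pd.DataFrame({
        "time": pd.date_range("2017-03-20", periods=24, freq="h"),
        "temp_c": 8.0,  # placeholder temperature
    })
    tv_events = pd.DataFrame({
        "time": [pd.Timestamp("2017-03-20 20:45")],  # hypothetical half-time
        "big_broadcast": [1.0],
    })

    df = pd.merge_asof(demand, weather, on="time")  # last known weather reading
    df = pd.merge_asof(df, tv_events, on="time",
                       tolerance=pd.Timedelta("30min"))  # only nearby events
    df["big_broadcast"] = df["big_broadcast"].fillna(0.0)
    print(df.tail())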


My initial best guess is that the style of analysis you're suggesting (TV guides, etc), doesn't offer sufficient safety margin in the event that a prediction is wrong. If reducing the safety margin was an option then there are probably lots of things that can be done, but if the goal is to maintain a safety margin akin to the current one then my instinct is that the gains to be had are minimal - and that the interesting work is in power storage (battery tech, cryogenic storage, etc.)


Predicting peaks in supply and demand for energy is essentially turning the utility into a hedge fund.

I think it's worth considering the overall characteristics of the "improved" algorithm, since the ML optimizations are likely analogous to leveraging based on an overfitted predictive model, and the objective of a power grid is resilience as well as efficiency.

Also, depending on how you define efficiency, it may be considered beneficial to shut down a coal plant and spin up a few hundred windmills, even if the resulting price per kilowatt hour increases by 5%.


I work in the industry and we've been "predicting peaks in supply and demand for energy" for years.

You're 100% correct about the "resilience" and "efficiency" part, and we currently have software for that as well (Google "Real-Time Contingency Analysis"), but the real improvements to be made in this area would be things that generally fall under the "smart grid" buzzword. The ability to automatically switch transmission lines in and out of service during an emergency event, better and/or automatic control of frequency and voltage (which we do have currently, but which could be further automated), and reaction/recovery/restoration after a relay incident (for example, an automatic "Blackstart" after a voltage collapse) would all fall under the kind of improvements needed to push electric grid technologies to the next level, in my opinion.


Very interesting! Thanks for the clarification. Would you say that the ML/AI aspect is "easy" once the automation is built? Or does the automation offer limited value without fairly sophisticated algorithms?


Well, the grid already has a prediction system, this is just supposedly improving that. Which makes me question whether it can actually reduce overall consumption, because surely that requires some of the consumers to change behaviour?

Dispatch priority already ensures that if wind is available it will be used in preference to coal: https://www.economy-ni.gov.uk/consultations/priority-dispatc...


I don't want to give any hints to DeepMind, since I am inclined to think that in this field an expert assessment can be better than DeepMind's advice. Just to give a few simple questions as examples of the kind of knowledge involved in those predictions: since energy generation and demand depend a lot on weather conditions, do they have any state-of-the-art machine learning model for forecasting weather? Can they predict the evolution of energy prices? Can they measure the impact of new improvements in reducing the cost of renewable energy? What about the impact of Brexit on the energy market? Are they first-class experts in time series, or do they simply apply current technologies like Prophet or Hyndman's R packages? How can they argue that technologies applied to the game of Go, like reinforcement learning, can be successfully applied to forecasting? Perfect game simulations let you generate almost infinite data, while current world markets and weather supply only a very limited amount of information to a model; how can they justify applying big data techniques to a small-data world? I hope some of those questions can be addressed; unfortunately, while writing this I hadn't read the post yet.


Reinforcement learning is used in spam detection and control amongst other things. You can do data augmentation and also do transfer learning.


In this concrete case, how do you do transfer learning? What is the domain whose experience you would transfer to the energy domain? Also, naive Bayes can be used in spam detection and usually gives good results; is RL such a great tool in spam filtering when there is only a moderate amount of data?


If you're in Cambridge (UK), Demis Hassabis (CEO of DeepMind) will be giving a talk to CSAR as part of the Cambridge Science Festival at 7:30 this evening: http://www.sciencefestival.cam.ac.uk/events/towards-general-...


This is about prediction of supply. In other words, becoming a weather forecasting organization, mainly for wind and clouds/shadow.

Demand is predictable, so there's little to gain there.

How big is the market for providing excess demand? If I can turn on air conditioning or charge electric cars in the millions, what can I earn as a company?


I don't know much about electric grid engineering - are there opportunities for an ML approach to increase efficiency in ways other than the sort of better supply forecasting implied by this article?

The article does mention the losses involved in long-distance transmission, but surely traditional approaches can already yield fairly well-optimised planning for improving this sort of efficiency?

(Finally, this article seems quite light on details to me and doesn't mention a source Google press release or anything like that with more specifics. Maybe just better supply forecasting could yield bigger benefits than I imagine...)


(Copy-pasted disclaimer: I'm involved at a junior level in very-unsexy not-Google-level smart grid research using machine learning, and my power grid engineering knowledge is painfully limited.)

Predicting demand is dicey dicey stuff, but throwing sensors on the grid gives a lot of information that can imply other factors. Just pulling this out of the air for an example: UK power is famous for having to deal with a sharp spike in demand when a big TV event is going on and then goes to commercial, because everyone in the country, in synchronized fashion, gets up, goes to the kitchen, and turns on the kettle for tea. (Really, it's a big deal!)

Now, that's not the best example because of the dramatic nature of the spike, but you can see how power information might on some level reflect the state just before that spike: people aren't moving around their homes, vacuuming, w/e, they're in front of the TV, right? So our demand forecaster learning from the data might not be able to tell that a new season of Sherlock is airing, but it might learn enough paranoia about everyone-watching-TV-at-once patterns to be useful.


DM is far from the first company to provide ML solutions to revolutionize utilities.

There are so many opportunities to increase the efficiency of our electric networks. Forecasting demand is not something they do well, but more importantly they could improve Demand Response and Energy Efficiency programs. Oh, and most of the techniques they use to prevent and stop theft are a joke (that's a $6B/yr problem in the US).

The utilities have not been forced to innovate. They won't innovate on their own because there are no customers at risk - no competition. (Aside from smart meters, and the main benefit of those was that they no longer had to pay for meter readers.)

There is a WORLD of opportunity for utilities to become more efficient, but they will not do it on their own. Our regulators need to force them to innovate.

Kudos to DM for this work, but I will be more impressed if they can actually get a major utility to implement these solutions.


"Forecasting demand is not something they do well, but more importantly they could improve Demand Response and Energy Efficiency programs."

- Source?

"The utilities have not been forced to innovate. They won't innovate on their own because there are no customers at risk - no competition."

- Also, what data can you provide to back this up? I work in the industry, and I can tell you that innovation depends largely on the type of energy market the utility operates in, whether or not it is a vertically integrated company, regulated or unregulated, an IOU or a POU, as well as a ton of other variables. So while maybe it's true that not every single company is innovating, to say generally that they "won't innovate" or that there is "no competition" is simply wrong, and spreads incorrect information about the industry.

I'd check this out if you're interested in learning more about the utility industry: https://www.osti.gov/scitech/biblio/15001013


"We think there's no reason why you can't think of a whole national grid of a country in the same way as you can the data centres" means that you haven't thought about it at all.

A grid has all sorts of actors involved, a whole bunch of whom are there purely to make money (manage risk, gamble... tom-a-to/tom-ar-to) - bolting an AI onto a complex system and then letting people !#$!@# with it to get their hedge contracts into the money is practically a guaranteed scenario.


Or run some TV ads telling people to turn their heat down and hang up their laundry...


What is 'UK energy use'? The title should tell what the article is about.


It refers to the generation of electricity. "National Grid" implies that, since that is the name of the system of wires that carries electricity throughout the UK. But only if you have the necessary context.


I interned at Numenta [0] in 2012. Numenta is building an open-source machine intelligence product [1] based on the human brain. Specifically, the algorithms are based on the theory of "hierarchical temporal memory" (HTM) [2] as described by Numenta founder Jeff Hawkins [3] in his book On Intelligence [4].

The basic idea is that the neocortex has a generalized learning framework that acts on generalized input from all five senses. For example, one study enabled blind people to "see" a ball in front of them and grasp it with their hands, by encoding the image of the ball onto electrical impulses on taste buds, using a device in contact with the tongue. I can't find a link to the original study, but here's an article that describes one such device. [5] The conclusion drawn from this is that, even though blind people have lost their sight, their neocortex is able to understand inputs from any sensory source (in this case, taste buds).

The Numenta algorithms generalize this idea to feed time series data into a virtual "neocortex" that builds custom ML models for each data source and chooses the best one (a form of online learning that eliminates the need for data scientists to create custom models for each data source).

Because the Numenta algorithms are optimized for time-series data, their areas of strength are prediction and anomaly detection. When I worked there (5 years ago, they've made a lot of progress since then), one of their main "case studies" was energy usage in large buildings. Jeff Hawkins gave the keynote presentation at Strangeloop 2012 where he discussed this specific application of the algorithms, lowering energy bills in factories by predicting the next day's usage in advance. [6]

My understanding is that the algorithms worked particularly well for energy consumption data (e.g. energy drops at night on weekdays, drops on weekends, spikes in meeting room X at 10-11am every day, etc). I would not be surprised at all if DeepMind is able to capture savings using similar methods.

[0] https://www.numenta.com/

[1] https://www.numenta.org/

[2] https://en.wikipedia.org/wiki/Hierarchical_temporal_memory

[3] https://en.wikipedia.org/wiki/Jeff_Hawkins

[4] https://en.wikipedia.org/wiki/On_Intelligence

[5] http://www.theplaidzebra.com/device-allows-blind-people-see-...

[6] https://www.numenta.com/blog/2012/10/22/jeff-hawkins-at-stra... (link to video at bottom of blogpost). You need to login to download the PDF slides, so I made an imgur album of the relevant slides to this discussion: https://imgur.com/a/5ULak
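
(For readers wanting a feel for the task: Numenta's real HTM implementation is in the NuPIC codebase linked above. The snippet below is NOT HTM - just a minimal rolling z-score baseline on a synthetic building-energy series, to illustrate the kind of streaming anomaly detection being discussed.)

    # NOT HTM: the simplest comparable baseline for streaming anomaly
    # detection on a synthetic hourly building-energy series.
    import numpy as np

    rng = np.random.default_rng(1)
    hours = np.arange(24 * 14)  # two weeks of hourly readings
    usage = 100 + 40 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 5, hours.size)
    usage[200] += 80  # inject one anomalous spike

    window = 48  # look back two full daily cycles
    for t in range(window, usage.size):
        past = usage[t - window:t]
        z = (usage[t] - past.mean()) / past.std()
        if abs(z) > 3:
            print(f"hour {t}: usage {usage[t]:.0f}, z-score {z:.1f}")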


You're aware that Numenta is considered a joke in the serious machine learning community, right?


Yes, and quite unjustifiably so in my opinion, which is why I sourced all those links in my comment. From what I've seen, Numenta is very good at a specific class of problems (time series prediction and anomaly detection).

Feel free to underestimate them. It's really not my problem.

And do pompous, abrasive comments like yours really add anything to the discussion? Why don't you elaborate on why they are a "joke" to serious intellectuals like yourself.


Because they continuously claim they're "ahead of everyone", say everyone else is stupid and their glorious HTM will beat everyone... and then they get destroyed by convnets... and then they move the goalposts to anomaly detection only... and then they get beaten on their own heavily rigged benchmark dataset.


Honestly, why is this news? :)


In photo #4 you can see a man pointing at a pretty insecure computer running Windows XP :) XP shouldn't be used anymore because there are no more security updates - or am I wrong?

The direct link to the image is: https://cdn.arstechnica.net/wp-content/uploads/sites/3/2017/...


The image has been used in a number of articles, the oldest of which I was able to find was from 2013 - I would assume the image is older than that even.


Enterprises can buy expensive support contracts directly from MS and still receive security updates. This is fairly common in big companies and in government. There are articles about them paying for XP custom support from 2014; I'm assuming they still are.

https://www.theguardian.com/technology/2014/apr/07/uk-govern...


It's also possible that these are library photos, and not up to date.


They will reduce energy usage by 10% and then raise prices because they aren't making enough money. It's happened in Canada with Ontario hydro, and in California with water.



