Show HN: Bitesnap – Deep Learning Meets Food Logging (getbitesnap.com)
174 points by vinayan3 on Jan 25, 2017 | 108 comments



I found your blog post (the learn more button) but I'd love more info on this if it's available somewhere.

How does it handle differentiating different types of bread, which have differing carbs?

How does it handle a thick layer of butter or another fat put on the sandwich in the Avocado Toast example, which would presumably be below the visible avocado?

A long time ago my friends and I offered a manual version of this as a service via sending pics / emails to us and us then manually going through and guessing. It worked well enough, so I have high hopes for a ML version!

My biggest pain point doing it manually came from pics of things like pasta where I couldn't really guess how much oil was in the sauce.

You can definitely get far with just estimating the macronutrients from a photo, and the absolute accuracy matters less than consistency in measurements over time.


> How does it handle differentiating different types of bread, which have differing carbs?

We don’t nail everything yet but we allow users to refine the predictions. So in your example we might predict bread and let the user pick the type.

> How does it handle a thick layer of butter or another fat put on the sandwich in the Avocado Toast example, which would presumably be below the visible avocado?

We don’t predict portion sizes yet. At the moment we give a sane default and ask the users to adjust it. The next time you eat the dish we bring back the past meal so you don't have to specify the details again. We’re hoping to start predicting some of those details once we get enough data from our users.

Thank you for trying it out and the feedback.


> We don’t predict portion sizes yet. At the moment we give a sane default and ask the users to adjust it.

This is a serious problem. Research suggests that one of the main causes of obesity in children is lack of ability to identify portion sizes or understand how much to eat.

Obviously there is a market of people who understand this well and want to track what they eat, but you are very likely going to be misleading a very significant amount of your userbase into making worse decisions for themselves.


Predicting portion size is something we’re actively looking into. One of the reasons for getting Bitesnap out to a larger audience now is to be able to collect more training data for doing this. We’re also experimenting with allowing users to specify portion sizes in more natural units -- for example by comparing a serving to the size of their fist -- and we’ll automatically convert these to conventional units. Finally, we’re also building more tutorials and help content into the app to educate people on better estimating portion size (among other things).
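The natural-unit idea above (a serving the size of your fist, converted to conventional units) boils down to a lookup-and-multiply. Here's a minimal sketch; the unit volumes, food densities, and function names are illustrative assumptions, not Bitesnap's actual data:

```python
# Hypothetical natural-unit -> grams conversion. All reference volumes and
# densities below are rough illustrative values, not real app data.
NATURAL_UNIT_ML = {
    "fist": 240.0,   # roughly one cup
    "palm": 85.0,    # a palm-sized piece, ~3 oz
    "thumb": 15.0,   # about a tablespoon
}

FOOD_DENSITY_G_PER_ML = {
    "cooked rice": 0.80,
    "avocado": 0.95,
    "butter": 0.91,
}

def natural_to_grams(food, unit, count=1.0):
    """Convert a count of natural units (e.g. 1.5 fists of rice) to grams."""
    volume_ml = NATURAL_UNIT_ML[unit] * count
    return volume_ml * FOOD_DENSITY_G_PER_ML[food]

print(round(natural_to_grams("cooked rice", "fist", 1.5)))  # 288
```

The conversion table would obviously need per-food calibration, but the interaction cost for the user stays at "about a fist of rice."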


You might look into a custom cutting board or plate/container that has a fiducial on it so you can measure each item. That's also another revenue stream.


I spent a lot of time looking for a small bluetooth food scale that I could put under my plate but wasn't able to find anything. If I ever find some free time I might try to make one.

Would be cool if I could pull one out from my pocket, stick it under my plate, get a measurement then subtract whatever is left after I'm done.


So you want early users to be guinea pigs for potential future users, and to receive inaccurate counts? I don't think there will be any future users if accuracy is an issue in the beginning.

This is one of those things that I believe you should have right from the beginning.


With regard to portion sizes, you could have users put a "standard" object in the photo, such as a dollar bill, credit card, teaspoon, quarter, or iPhone, as a reference size. Eggs are also reasonably consistent, so in photos with eggs, that could work too. Love the idea (I had it myself a long time ago, but too many things to do and not enough time) and hope this takes off!
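The reference-object trick reduces to simple proportional scaling once a detector gives you pixel widths. A minimal sketch, assuming a credit card (85.6 mm wide, a real ISO/IEC 7810 dimension) as the fiducial; the pixel measurements and function name are made up for illustration:

```python
# Estimate a food item's real-world size from a known-size reference object
# in the same photo. Pixel widths would come from an object detector.
CREDIT_CARD_WIDTH_MM = 85.6  # ISO/IEC 7810 ID-1 card width

def estimate_width_mm(food_px, reference_px, reference_mm=CREDIT_CARD_WIDTH_MM):
    """Scale the food's pixel width by the reference object's known width."""
    mm_per_px = reference_mm / reference_px
    return food_px * mm_per_px

# A pancake spanning 300 px next to a card spanning 200 px:
print(round(estimate_width_mm(300, 200), 1))  # 128.4
```

This only works when the reference sits roughly in the same plane as the food, which is why a printed fiducial on a plate or cutting board (as suggested elsewhere in the thread) is more robust than an arbitrary object.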


Just downloaded it, the app is really elegant. Really nice work. I have used weight watchers in the past, so here are a few thoughts:

1. It's not totally clear to me what the goal of the app is. Is it going to help me lose weight? Help me avoid unhealthy foods? Why am I tracking? Do I get to choose why I am tracking? Tracking is a big commitment, so I would lead more with what the benefit is, to motivate me to decide to track.

2. I really love the weight watchers approach of boiling everything down to a single point count. I have been around WW long enough to see them change the meaning of the points to incentivize different behaviors. For example, raw fruits and vegetables are generally zero points, even though they clearly have calories. High sugar foods are higher in points than their calories would suggest. I find a point system much more useful than a calorie system.

Overall, if your goal is to help people lose weight, I'd suggest you look at what WW has been doing in their app, and also in how they have changed their point system over the years. I actually think WW overall (including the meetings) is an amazing system.

Interestingly, I have gotten to the point that I basically know the points of everything I eat regularly. Originally, I loved the WW app because it was so comprehensive, but now I just use a tiny notebook and pen. It's a lot faster than messing with the app.


Awesome, glad you like it.

Our goal at the moment is to focus on making the logging experience as simple as possible. Weight loss is one of the main use cases but we have a few beta users who are logging for health reasons, trying to improve their diet and even a chef who’s doing it for fun.

We’d like to make the app customizable enough to fit most of those use cases. We don’t want to push calorie counting on everyone and have an option in there to disable the calorie and macro cards. As we add new features we’ll let users decide if they want them to appear in their feed.

We’re considering adding a simpler point system, maybe even one that adjusts based on your goals.


Great start for a product. I would imagine everything that's currently in your food database had to be manually chosen due to the availability of training data. It doesn't handle very many branded products, but that could be really valuable in keeping users engaged and using your application daily, so I would suggest adding barcode scanning (I'm currently building an application with the Nutritionix API; it's got a great dataset). You could even ask your users to take a picture of their barcode-entered food so you can start learning on a much wider variety of products. My two cents.


Hey, thanks for the feedback.

All of our nutrition data comes from the USDA right now. We don't recognize everything that’s in there yet but we map our predictions onto some of the “nodes” and let people refine the predictions to a more specific item.

We don’t recognize packaged products yet but plan on doing it once we have enough data. Barcode scanning is almost done and should make it into the app soon.


I wouldn't use this. It's not that I don't think your calorie count might be right for some cases, but it won't get everything right, and when you're trying to lose/gain weight, making sure you have accurate calorie counts is crucial (you need measuring spoons/cups, etc).

Which is why I'm sticking with MyFitnessPal. Also, I find that although it's tedious to keep count of calories in the beginning, once you get used to it, it becomes a game, and even fun.


This feels like it fills the void between people who don't care about calories and people who really care about calories. I know people (read: myself) who are too lazy to measure everything down to the milligram and just want an order-of-magnitude sense of calorie intake.


I agree that this can be useful for this use-case. Right now I'm bulking so it's more important for me to have a general "good estimate" of my calories and macros over a week, rather than a hyper-refined daily view, where I need a granularity of 100 calories or I haven't lost weight that day.

For my situation, it's more about "damn, I ate 1000 calories over last week, oh wow lol it's because I got super stoned on friday and ate half a pizza, ok, so next week eat 2 eggs instead of 3 for breakfast to make up for it." This app definitely wouldn't work for a cut, though, because I need my measuring cups and spoons to do that right.


Thanks for the feedback!

With Bitesnap, you can enter exact cups, ounces, etc. if you want to refine your calorie estimate. One of our goals was to build a flexible tool where it would be quick to get a ballpark number, but also possible to get a very precise number if you put in just a bit more effort.

We think that the visual side of things will be useful to many people, even with ballpark calorie estimates. It's a great way of developing mindfulness of what you’re eating and improving and maintaining your diet.

Also, as we have been working on this it’s been a really fun game to see what we can recognize!


Looks neat, but at the same time, really cumbersome. Because it looks neat, I'd guess people will try it; because it will be cumbersome, people will abandon it. So like many similar fancy AI/recognition apps, I think it's finding a problem for a solution; it's over-engineered.

My problem is, taking pictures is more effort than picking an item off a list, as current caloric counters do.

Most people eat roughly the same things on a regular basis, so they'll end up ticking away a meal before/after they have it and be done with it. With this, you'd be taking pictures while having your meal.

Another use case is planning a day ahead; again, pics don't work here, since you can't take pics ahead of time.

And of course, the results from the pics have to be corrected so the app learns. It seems easier to just pick the item from a list immediately, without having to take a pic first; auto-completion on an input works wonders (though of course you'd be picking from a longer list).

Maybe I'm just old or not enough of a techie, or photographer, but for me typing a short text, even on a phone, is actually faster than taking a pic.


> My problem is, taking pictures is more effort than picking an item off a list, as current caloric counters do.

How can this be? Taking a picture is at most 2 taps, if it pre-fills 90% of your list (even 40%) it has saved you numerous taps.


My goodness, I just started trying to use MyFitnessPal to track my food intake and was wishing that something like this existed.

Is there a way to save prior entries as meals? I am a boring person and I eat the same thing for breakfast 7 days a week. I would like to just add this with one click instead of selecting: Eggs... Spinach... Oatmeal... etc. MyFitnessPal has this and it is a great time saver.


Hey. We recognize your past meals, so if you take a picture of something again it will let you copy the entries.


Just to add to what m_ke said, we do a visual search to recognize when you're eating a meal that's the same as one you've logged before. You can copy all of the items, including the portions and customizations that you entered before, with one touch.

One neat thing is that our model has figured out which features are relevant to this task -- so tomorrow, even if you eat eggs, spinach, and oatmeal in a different container or at a different place than you did today, we can still recognize that it's the same thing.

You can see this in action in the first shot of our demo video:

https://youtu.be/Uw6kjbiFcNs
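The "same meal as before" matching described above is, in the common formulation, a nearest-neighbor search over image embeddings. A hedged sketch of that idea, where the embedding vectors stand in for whatever CNN feature extractor the app actually uses (the data format and threshold are assumptions):

```python
# Match a new meal photo against past meals via cosine similarity of
# embedding vectors. Embeddings would come from a CNN's penultimate layer.
import numpy as np

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_past_meal(new_embedding, past_meals, threshold=0.9):
    """Return the most similar past meal if it clears the threshold, else None."""
    best = max(past_meals, key=lambda m: cosine_similarity(new_embedding, m["embedding"]))
    if cosine_similarity(new_embedding, best["embedding"]) >= threshold:
        return best
    return None

meals = [
    {"name": "breakfast", "embedding": np.array([1.0, 0.0])},
    {"name": "lunch", "embedding": np.array([0.0, 1.0])},
]
print(match_past_meal(np.array([0.95, 0.05]), meals)["name"])  # breakfast
```

An embedding trained for this task can be invariant to container and location, which is what lets tomorrow's eggs-spinach-oatmeal match today's even on a different plate.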


I took a picture of my finished Styrofoam hot chocolate cup and straw. Hot chocolate was the first guess. It gave an option for "from mix, water added"; done. Very cool. Edit: just logged my coworker's lunch in about a minute: cucumber and a grilled cheese sandwich in one picture; it let me add them from a list. Ramen soup took another try at a lower angle.


Thanks for trying the app!


Thank you for checking out Bitesnap. We really appreciate the feedback and comments.

We have a blog post up explaining more about Bitesnap and why we built it. https://blog.getbitesnap.com/introducing-bitesnap-a-smart-ph...


Awesome, I was just talking about how I'd want this kind of product!

I'd be more than willing to pay a monthly fee ($5 / month) for you to have someone confirm the details of my meals and label meals that your system doesn't recognize. I'd be happily paying you to build a higher quality training set because

1) I don't want to fill out the extra information (although your interface makes that process less painful than it would be otherwise)

2) Paying would make me much more likely to be a consistent user

Anyway, excited to try it out, and if you ever try a paid upgrade I'll definitely be a guinea pig :)


Can you make this available on other countries' stores as well? Specifically I'd love to see it on the German iOS AppStore.


UK as well, please.


This is great! As another diabetic, this type of software is HUGE in allowing those of us with dietary restrictions to eat with a bit more freedom, or maybe less anxiety. While nutrition labels are great, going out to eat means you're often left to guess how many carbs you're taking in. Anything that provides a more accurate assessment of my carb intake is great.

I am about to get my SCiO unit, which provides a means of sampling small amounts of food to determine the nutrition facts. The minor issue here is that it doesn't tell you the total amount of carbs, only the carb density.

I could see this product working alongside a SCiO-type device: get the macro assessment of the food you're going to be eating from the photo, then get the nitty-gritty details by hooking into the SCiO data on the spot. If bread is detected: "Please get more accurate details on your meal by sampling your bread with your SCiO-type unit".
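The density-vs-total gap described above is just a missing portion weight, which is exactly what a photo-based estimate could supply. The arithmetic, with illustrative numbers:

```python
# A SCiO-style scan gives carb density (grams of carbs per 100 g of food);
# total carbs also needs a portion weight, e.g. from a photo-based estimate.
def total_carbs_g(carb_density_per_100g, portion_weight_g):
    return carb_density_per_100g * portion_weight_g / 100.0

# Bread at roughly 49 g carbs per 100 g, a 60 g slice:
print(total_carbs_g(49.0, 60.0))  # 29.4
```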

Great stuff! Keep it up!


A few thoughts:

- Very slick onboarding experience, especially compared to other calorie counters. Big plus here.

- There doesn't appear to be a way to add food outside of the current "meal." I'm sitting here at lunch time, but wanted to add what I had for breakfast --- instead I've just eaten a very large lunch.

- The current database of foods seems pretty slim. No entry for my African Peanut Soup, for example, which is available in both LoseIt and MyFitnessPal.

- How will you deal with things like sandwiches, where many of the ingredients may be totally hidden from view? Guess "sandwich" and let me pick what's on it from a sensible list of sandwich ingredients? Same goes for soups, or stews, or anything that can be visually similar with a wide range of possible ingredients.

Over all a good start, and some much-needed innovation in the calorie-counting app space.


> Very slick onboarding experience, especially compared to other calorie counters. Big plus here.

Glad to hear that you liked it

> There doesn't appear to be a way to add food outside of the current "meal." I'm sitting here at lunch time, but wanted to add what I had for breakfast --- instead I've just eaten a very large lunch.

Yeah we had a bunch of people asking for that the past few days. We should have that fixed in the next release.

> The current database of foods seems pretty slim. No entry for my African Peanut Soup, for example, which is available in both LoseIt and MyFitnessPal.

All of our data comes from the USDA right now. We’re going to add barcode scanning soon and that will include another 70K items. After that we plan on making it easier for users to add new things by OCRing the nutrition labels and computing the nutrition values from ingredients/recipes.
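The "compute nutrition from ingredients/recipes" idea above is a weighted sum over per-100g nutrient values. A minimal sketch; the nutrient numbers below are rough illustrative values, not real USDA entries, and the data format is an assumption:

```python
# Sum USDA-style per-100g nutrient values over a recipe's ingredient weights.
# The per-100g figures here are illustrative, not actual USDA data.
PER_100G = {
    "egg": {"calories": 143, "carbs_g": 0.7},
    "spinach": {"calories": 23, "carbs_g": 3.6},
    "oatmeal": {"calories": 68, "carbs_g": 12.0},
}

def recipe_nutrition(ingredients):
    """ingredients: list of (name, grams) pairs -> dict of summed nutrients."""
    totals = {"calories": 0.0, "carbs_g": 0.0}
    for name, grams in ingredients:
        for nutrient, per_100g in PER_100G[name].items():
            totals[nutrient] += per_100g * grams / 100.0
    return totals

print(recipe_nutrition([("egg", 100), ("spinach", 30), ("oatmeal", 250)]))
```

With OCR'd labels, the same structure works: the label gives per-serving values, and the sum runs over servings instead of grams.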

> How will you deal with things like sandwiches, where many of the ingredients may be totally hidden from view? Guess "sandwich" and let me pick what's on it from a sensible list of sandwich ingredients? Same goes for soups, or stews, or anything that can be visually similar with a wide range of possible ingredients.

For more complex items we have these “builders” that let you quickly adjust and add common ingredients to things like sandwiches, salads and soups. As we get more data we’ll use ingredient correlations and predictions to make the suggested additions more accurate.

The app also learns to recognize your past meals so you quickly copy the information for meals that you eat often.


I tried out an old photo from December, and it noticed that the date was not today. I was asked if I wanted to add it for that date. I was also able to change the time on the edit meal screen. I think this should be on the input screen - similar to Google Fit. When I am entering an exercise, the time defaults to now, but I can select another time just as easily.


This is cool. When I am aware of the nutrition in my food (especially trends over time) I eat better, but I always get sick of the tracking tool.

Putting it out there: I would pay a lot of money for a consumer tech wearable or even implant that would track calorie consumption in the background.


That's what we noticed as well. Calorie counting is valuable on its own, but we think that the biggest benefit of tracking what you eat is the awareness that it builds up. It really helps you figure out what the weak points of your diet are.

Having this integrated into google glass or spectacles would be great.


Heh, a few years ago I thought of this same thing. I didn't know anything about ML so I thought it'd be possible by doing the following (for food served in a restaurant/fast-food)

1. Tag the location (if you go to McDonalds and take a picture of a Big Mac you'll see that you're at McDonalds and you have a picture).

2. Then, to get your "nutrition" info you have to manually specify what you're eating.

3. What you're eating would then be matched to a database that would provide the nutrition information. The picture basically would be there just to show you what you ate.

---

This looks WAY better than that.


That’s another feature that we’d like to add to the app. We have a way to match up similar-looking meals, so if we knew that a user was near a Shake Shack and had examples of their burgers, we could predict the exact item.


Very nice!

About 10 years ago I created a cooking web site that also uses the USDA nutrition database (cookingspace.com) and I just recently started working on free iOS and Android apps that will use the improved analytics code that was originally used on my site.

I am playing with the Android version of your app right now; so far it has done a good job recognizing food items pulled out of our refrigerator. I also like that when I take a picture it does not add the image to my local pictures (since those are automatically and instantly backed up to OneDrive and GDrive).


Very cool! I shared a small project to demo and explain how I used convolutional neural networks to classify food images: http://blog.stratospark.com/deep-learning-applied-food-class....

I'd be curious about the calorie detection. I'm wondering if it's using some kind of weighted sum of image segmentation proportions, or doing end-to-end deep learning.

Anyway, cool product, love to see where it goes!


Hey, that's a really cool project.

We haven't tried going directly from image to calories yet, and I'm not sure that we ever will. Instead the plan is to do end-to-end portion size prediction for some of the classes. Segmentation would be cool but it's really hard to get the data for it.

By the way, plotting images with matplotlib is a pain. Try using HTML with base64 encoded images instead. Something like this should work:

  from io import BytesIO
  from itertools import zip_longest
  import base64

  # get_pil_image is a helper (not shown here) that takes a path or an
  # in-memory image and returns a PIL Image.
  def base64image(path_or_image, prefix='data:image/jpeg;base64,'):
    s = BytesIO()
    get_pil_image(path_or_image).save(s, format='JPEG')
    return prefix + base64.b64encode(s.getvalue()).decode('utf-8')


  def show_images(paths_or_images, predictions=None, sz=200, urls=False):
    from IPython.core.display import display, HTML
    predictions = predictions if predictions is not None else []
    img_tags = map(lambda p: '''
      <div style="display: inline-block; margin: 2px; width: {sz}px; height: {sz}px; position: relative">
        <img src="{b}"
             style="max-height: 100%; max-width: 100%;
                   position: absolute; left: 50%; top: 50%; transform: translate(-50%, -50%);
                   border: {bsz}px solid rgba(255, 0, 0, {pred});"/>
      </div>
      '''.format(b=p[0] if urls else base64image(p[0]), pred=1 - p[1] if p[1] is not None else 0, sz=sz, bsz=5),
                 zip_longest(paths_or_images, predictions))
    display(HTML('<div style="text-align: center">{}</div>'.format(''.join(img_tags))))


If this really works, consistently (i.e. within some reasonable margin of error), Fitbit should buy this intellectual property immediately and promote it as a way to make calorie-intake tracking dead simple.

For myself, and for many others, calorie-intake tracking was/is one of the last hurdles jumped before weight loss/maintenance efforts really achieve great effect. It's such a pain (time-consuming, tedious) to do it manually, especially if you have any reasonable amount of variety in your diet.


> For myself, and for many others, calorie-intake tracking was/is one of the last hurdles jumped before weight loss/maintenance efforts really achieve great effect. It's such a pain (time-consuming, tedious) to do it manually, especially if you have any reasonable amount of variety in your diet.

That's one of the main reasons why we ended up working on this. I was pretty overweight as a teenager and lost over 60lbs in one Summer by really paying attention to what I ate (and exercising). I tried using a few of the calorie counting apps but they felt like a chore and really nudged me to use packaged products since I could scan the barcode to log them.


Food logging is a main entrance for household "ERP". If implemented right, there are so many potential use cases. But I guess it's easier to extend from the chained things behind the entrance toward the entrance than to extend outward from the entrance itself. Anyway, it's good to see there is someone working on the hard part of food logging.


I just logged a couple of meals with it - really like the slick interface - it's very intuitive. One of the main things preventing me from using other services is how clunky / frustrating they are. This is a good balance between "close enough" and too much tedium. Nice work!

One other area of feedback: while the onboarding was slick, I felt the mapping from hours of activity to the labeled "level" seemed a bit off. For example, I do a high-intensity workout almost every day of the week for over an hour, either strength training or cardio, and that level of activity at 7 hours per week only put me at "lightly active" (I forget the actual terminology and cannot restart the onboarding screens without uninstalling). I was just curious how you came up with the activity scale.


When I read this: "We saw an opportunity to apply recent advances in image recognition to simplify the food logging process"

do they mean CNNs for image classification and/or recognition? Does the app estimate the distance and the portion size, and if not, how feasible would that be?


Yeah, it’s a pretty standard conv net. Right now we’re only recognizing the foods in the image. We’re hoping to start predicting the portion sizes for the common foods once we get enough examples from our users.
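Turning a conv net's raw outputs into the refinable suggestions described above is typically softmax plus top-k. A small sketch with made-up class names and logits; nothing here reflects Bitesnap's actual model:

```python
# Convert classifier logits into the top-k suggestions a user can refine.
import numpy as np

CLASSES = ["avocado toast", "pancakes", "ramen", "salad"]

def top_k_predictions(logits, k=3):
    """Softmax over logits, then return the k most probable (class, prob) pairs."""
    probs = np.exp(logits - np.max(logits))  # shift for numerical stability
    probs /= probs.sum()
    order = np.argsort(probs)[::-1][:k]
    return [(CLASSES[i], float(probs[i])) for i in order]

print(top_k_predictions(np.array([2.0, 0.1, 1.2, -0.5])))  # top suggestion first
```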


If this has MyFitnessPal integration I'll start using it immediately. Would be great to supplement my tracking when I go out to eat. Your guess is as good as mine, hopefully better, and much easier for me to snap a picture than do the guessing.

Would it integrate via HealthKit perhaps?


I'd second the HealthKit integration! I track all my info via HealthKit. I'm a diabetic, so that includes blood glucose readings, activity, weight, and of course food intake, with carbs being the most important. HealthKit is a good (really it's just OK-ish, but it's the best we have IMO) platform for me to get a complete view of my health information, and it's hugely valuable for my doctor.


We plan on adding HealthKit integration soon. Not sure if we could do anything to provide deeper integration with MFP.


+1 for integrations. I would use this if it could send data over to Google Fit or MyFitnessPal or Fitbit etc.


+1 for Apple Fit and myFitnessPal integration.


Can you describe your training or dataset you used?


It's a convolutional neural net that's very similar to the one that won ImageNet last year. We're doing standard preprocessing with OpenCV and training the net in Theano. The dataset is a mix of images we got from our beta users and stuff that's openly available on the web.


I think the food-ID code can be used for more stuff if it works. I always wanted to snap a picture of my refrigerator and have my available food updated in some database. Then hook it up to some recipe database and we're talking (you could cook X, Y, Z with what's available, or if you'd buy A, B, C... integrate that with some food delivery service to monetize).

I'm suspecting this is a concierge MVP of sorts...if not consider me quite impressed. The last time I checked food identification (or portion sizes really) was a fairly hard problem. Edit: Guess the food identification isn't that hard anymore. Yikes times are moving fast :D


There's an example of this kind of classification in the Faster R-CNN paper. Microsoft's implementation actually gives a fridge as the example:

https://github.com/Microsoft/CNTK/wiki/Object-Detection-usin...


Love this for prepared meals and home cooked meals, where you need to analyze the ingredients of a dish. Have you considered adding the ability to recognize a standard FDA Nutrition facts label as well? For example if I have a protein bar, it would be great to snap a picture of the label and OCR the macros. I know that differs though from your current tech of a CNN for image analysis, but it would round out the product to cover a greater percent of foods eaten.


Yeah, it’s actually something that we played with already. I have a basic prototype of it working and might be able to get it into the app in the next few months. It should really reduce the friction of adding new items. At some point we’d like to start recognizing packaged products but we’ll need the data to make it work.

I mentioned this here as well: https://www.reddit.com/r/MachineLearning/comments/5ol7od/d_w...


I love the idea and the technology and design seems great! However, I found out that I save way more time and am more effective just sticking to a few basic principles of eating food instead of tracking everything. Slow-carb diet where I eat no bread until one cheat day a week - works so far!

The benefit of pictures seems to be that I'm forced to think about what I am eating before chowing down.


I've been using it all day and have the following feedback.

1. I like the design, and it works very cleanly. Nothing appears to be heavy, and the UI is intuitive.

2. It only does one thing: track food. I think this is its biggest strength. It doesn't do fitness or anything else right now, which it shouldn't.

3. The default goal to lose weight is simple to use and I think captures the predominant use case - by calorie reduction. I think the goal breakdown with consumed/remaining should be the most prominent UI element on top with the breakdown by calorie type being second. Having the challenge details (# meals X days) at the top isn't data I need each time I open the app. I can see why that would be a design challenge.

4. I have only used it a day so I can't say how well or not it displays trends about calories/nutrition over time, but I know that I would like to be able to break things down more.

5. The biggest challenge I think you have is with CV. Correct me if I am wrong, but my guess is that you are trying to use users to do reinforcement learning on your Deep Vision Nets. I am a Deep Vision guy myself (which is why I downloaded it by the way) and my guess is that you are going to have a hard time doing training this way. Here is why:

A. If the results of the object classification are good enough to always be result #1 (because it's obvious you are using a probabilistic return set (imagenet?)) then over time people will be annoyed at having to select the object/food in addition to taking the picture.

B. If the results are not good, then people will get annoyed with having to take the picture AND ALSO enter the food type. They will just resort to entering it manually each time. For example I made vegetable curry for dinner, and didn't take a photo because I knew it wouldn't know what it was.

So as a result, your training set will stagnate and won't learn any better than if you did it with a team of people. If you want it to really learn, you're going to have to incentivize or force people to always take a picture and always tag it. Even better if you can have them draw a bounding box around each item, right?!

By the way, crowd sourcing Machine Vision training is I think the right way to do things (that's what we do with interior home objects FYI).

I look forward to seeing iteration here. Best of luck.


Hey, thanks for the thoughtful response.

We're not doing any reinforcement learning, we just fine tune the net as we get more data (and occasionally train from scratch when we add a lot of new classes).

With regard to (A), we plan to start skipping the selection steps for predictions that we're really confident in and will just add them by default. I think once we have enough data we might even be able to predict what users will eat before they take a picture. I eat practically the same thing for breakfast every day of the week, so it could just log it for me without requiring me to do any work. Same goes for things like coffee shops: we don't ask for location right now, but if you always get the same thing when you walk into a coffee shop, we could just log it for you based on the fact that you were there.
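The confidence gating described in that answer can be sketched as a simple threshold triage. The thresholds and the prediction format are assumptions for illustration:

```python
# Skip the selection step only when the top prediction clears a high bar;
# otherwise pre-fill a suggestion or fall back to manual entry.
def triage_prediction(predictions, auto_add_at=0.97, suggest_at=0.30):
    """predictions: list of (label, probability), sorted by probability descending."""
    label, prob = predictions[0]
    if prob >= auto_add_at:
        return ("auto_add", label)   # log it without asking
    if prob >= suggest_at:
        return ("suggest", label)    # pre-fill, let the user confirm
    return ("ask", None)             # fall back to manual entry

print(triage_prediction([("oatmeal", 0.99), ("porridge", 0.01)]))  # ('auto_add', 'oatmeal')
```

The auto-add threshold trades annoyance against wrong silent logs, so in practice it would be tuned per class against observed correction rates.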

B. We keep track of our predictions and what users end up logging so we can tell what our weaknesses are. When we train new models we prioritize the weakly performing classes, especially if they're popular among our users.
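The weak-class prioritization in (B) amounts to mining prediction-vs-logged pairs for error rates weighted by popularity. A hedged sketch; the log format and weighting are assumptions, not Bitesnap's actual pipeline:

```python
# Rank classes by how often the model got them wrong, weighted by how
# often users actually eat them.
from collections import defaultdict

def weak_classes(log):
    """log: list of (predicted, actual) label pairs -> classes, worst first."""
    errors, counts = defaultdict(int), defaultdict(int)
    for predicted, actual in log:
        counts[actual] += 1
        if predicted != actual:
            errors[actual] += 1
    # Prioritize popular classes the model misses most often.
    return sorted(counts, key=lambda c: errors[c] * counts[c], reverse=True)

log = [("toast", "bagel"), ("bagel", "bagel"), ("ramen", "pho"),
       ("pho", "pho"), ("toast", "bagel")]
print(weak_classes(log))  # ['bagel', 'pho']
```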


Awesome, I've just tried it with an apple and it worked!

Can you let me enter my height and weight in metric please, as I had to use Google to convert.


Thank you for trying the app and the feedback. We will add metric units for entering your details.


"This app is incompatible with all of your devices".. :( I have a Note 4. What sort of device do I need?


Our Android app should work on any device running KitKat or above. But at the moment, we're only available in the US and Canada.

We're working on supporting more countries, but localization can be difficult -- we have to worry about things like foods and preparation methods that are local to a region, etc.


This is probably due to country restrictions. I get the same "incompatible with all your devices" on the website, but "This item is not available in your country" in the Google Play app.


Same here. What's up with the country restrictions?


Yes, I'm in the United Kingdom and getting the same message.


Got the same on nexus 5


I'm in the UK and managed to install it without any issues other than the measurements all being imperial.


From the looks and description it seems very interesting to me, and I would love to try it. However, I am unable to install it on Android. I'm currently in a European country (also connected via Swiss VPN); might this be one of the reasons? Is there any reason you might have geofenced it for now?


Is there an API? Can I export my data?

I'd be interested in using this if I can get the data out. I've been posting everything I eat and drink to my own website for the past few years, sometimes with photos sometimes just text. I'd love to have a better workflow for doing that!


We don't have an API yet but we'll add a way to share/export your history at some point in the next few months.


Very cool, gonna check it out. I've used Calorie Count in the past, but input was the main drawback there.

I noticed that the image of the phone with app on the homepage takes a few seconds to load on a mobile connection. Might want to optimize that for faster loading.


I made something like this last year https://devpost.com/software/picknic-ym5txf

It was just Clarifai -> keyword search in the USDA database -> log, though.
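That pipeline is small enough to sketch end to end. Here's a hedged version with the image-tagging call stubbed out as a plain list of labels (the function names and the toy food table are my own, and the real Clarifai client API isn't shown here):

```python
def lookup_usda(keyword, food_db):
    """Naive keyword match against a USDA-style table of
    {food name: calories per serving}. Illustrative only."""
    matches = [name for name in food_db if keyword.lower() in name.lower()]
    return matches[0] if matches else None

def log_photo(labels, food_db, diary):
    """labels: concept strings returned by an image tagger (e.g. Clarifai).
    Logs the first label that resolves to a known food and returns it."""
    for label in labels:
        food = lookup_usda(label, food_db)
        if food:
            diary.append((food, food_db[food]))
            return food
    return None
```

The obvious weakness, as the comment implies, is that a bare keyword match can't disambiguate preparations or portions.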

Looks cool!


Nice work! The Clarifai API is pretty awesome.


Sounds like an amazing product - would love to try it here in the UK.

It might help to add a mailing-list signup to your website to capture early overseas adopters.

Does the app allow you to take a picture of the aftermath to account for unconsumed leftovers?


Good idea! We had a beta signup form up there before but just took it down today. We might bring that back for people outside of the states.

We don't predict portion sizes yet. We could try adding that once we do.


Please do. Otherwise I shall be forced to side-load your app, and I'd prefer to get it through Google Play (seriously, I have wanted this product for a long time).


Hey, add your email here https://goo.gl/forms/ZP2bQOL5aCS1NjlR2 and we'll invite you to our beta group.


Ok, this just seems a bit unbelievable. I'd love to be proven wrong though.


Hey overcast, you can download the app on the App Store or Google Play and try it out for yourself. Let us know what you think!


App Store telling me the app is not currently available in Australia.

I'm guessing there is nothing specifically US centric in the app, any chance of opening up region support and sharing the love with your mates down under?


If you're still interested in trying it out add your email to this list https://goo.gl/forms/ZP2bQOL5aCS1NjlR2 and we'll send you a beta invite.


Hey this looks great - but this is what I see in Australia: http://imgur.com/H62UBYE when I search the appstore.


Yeah that sucks. If you'd still like to try it out you can sign up here (https://goo.gl/forms/WQ2VOJwRsn9yWTfC3) to get a beta invite.


Thanks - submitted!


How realistic is it that you could create something that gets attached to your stomach and is constantly monitoring your food?

Cool service btw. I remember a company that did this with Mechanical Turk lol.


More likely: subdermal implant that monitors nutrients in your blood.


I actually had this same idea about a year ago. A dietitian I know told me it would be very hard to infer calories just from a picture, so I didn't pursue it.


I'd love to switch to using this full time, but why isn't it available in my country (Pakistan)? I currently use Lifesum to track what I eat.


I just made a beta signup form for people who aren't in North America. If you sign up, we'll add you to TestFlight or our Android beta.

https://goo.gl/forms/WQ2VOJwRsn9yWTfC3


Why have this restriction at all? No one cares about localization in my country (Sweden). Just give us exactly what works in the US and 99% will be happy.


At the moment, we're only available in the US and Canada. We're working on supporting more countries, but localization presents some unique challenges for us. Commonly eaten foods and preparation methods vary quite a bit from region to region.


Ditto for Australia. Is there a separate review process for each country, or is this just the app publisher ticking only "USA" when they publish?


We just made a form for people who couldn't access it but would like to try it out. We'll just add you to our beta if you fill it out.

https://goo.gl/forms/WQ2VOJwRsn9yWTfC3


Unavailable in Australia. Weird, because this is the one way of logging food that doesn't differ between countries.


Not available in Switzerland either, where a lot of people also understand English.

Meanwhile, another app with the same name already exists and can be downloaded in Switzerland - http://bitesnap.appstor.io/ - I thought this was the one; very buggy, bad experience. Oh dear.

I can understand not wanting to launch in non-English-speaking app stores, given the risk of attracting negative reviews. But for this app, would it really hurt? The marketing right now is aimed at an English-speaking audience...

Frustration all around

edit: just saw https://news.ycombinator.com/item?id=13484472 - OK fair enough :-/


We're being a little bit cautious in expanding beyond the US at the moment because commonly eaten foods and preparation methods can vary a lot between countries. We also don't support metric serving sizes yet. Hopefully, we'll be able to support Australia soon!


It would be nice to at least provide a way to try it outside the US/CA, like posting a link to an APK here, especially since this is a «Show HN», which means you want to show your product to this community. I wouldn't be too concerned about negative reviews, because A) non-hackers won't find this link and won't know how to install it anyway; and B) most people here are smart enough to understand it's a prototype currently limited to one culinary culture, if you explain that while providing the link. This would avoid the frustration that probably more than half of the people here are feeling when they want to try your app.

It looks damn cool, but I'm really sad not to be able to try it just because I don't have the right IP address :-(


Hey, that's a great idea. I just made a beta signup form for anyone who'd like to try it out.

https://goo.gl/forms/WQ2VOJwRsn9yWTfC3

We'll just add you to testflight or google play beta.


I would have installed your app if you didn't have region restrictions, now I'm not going to bother.


Hope I'm not spamming too much with this link, but we just threw together a Google form for people who are outside of North America and would like to try it out. We'll invite you as beta testers.

https://goo.gl/forms/WQ2VOJwRsn9yWTfC3


Looks amazing - but is it possible that you cannot download it in Europe (Belgium)? I did not find it in the Google Play store.


How accurately does this predict... carbs? And how does it handle different amounts? Like different cup sizes?


How accurate is the calorie count? And out of academic curiosity, how did you validate the calorie counts?


Seems pretty neat so far.

Only imperial units though :(


We’re initially targeting the US market, and it was easiest to get serving data in imperial units to start. We are planning to add sensible serving sizes in metric units as soon as we can.


Great start!


How's it compare to HungryBot from infinome?


Why isn't it available in all countries?



