Show HN: Neutron Academy, Google Assistant Powered Learning (neutron.academy)
69 points by cyberpanther on Aug 1, 2017 | 22 comments



I'm thrilled to see apps like these, because I believe AI's greatest potential is to help people become smarter and better educated, with better judgment - whether it improves critical thinking by asking challenging questions, has you think through complicated scenarios (even just things in your life), teaches subject matter in an adaptive way that makes things easy to grasp, or just makes learning come to life (your own personal professor).

It's sad that none of the leading voices in AI talk about this potential; have they ever? The ratio of talk about apocalypse or singularity to things like this probably tells you something.


They need to add a way to tell the system it has a bad question/answer combination (in case any of the devs are on here).

Example:

Question 2: What is the branch of science that studies the composition, properties, reactions and the structure of organic compounds?

I responded with "Organic Chemistry", but it only accepts the answer "Chemistry". While that answer is technically true, it's misleading for someone trying to learn: the question specifically describes the organic branch of chemistry, not the subject as a whole.


Totally understood. Part of adding a quiz allows you to enter multiple accepted answers, and quiz quality depends on the person who entered the quiz. The intention is not for us to come up with curriculum for everything, but for people to create their own quizzes which they can distribute to a class.

However, that particular quiz is one of our "launch" quizzes, so I went ahead and added that answer. Both chemistry and organic chemistry should work now.
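For illustration, a minimal sketch of how multiple accepted answers could be checked - the field names and helper below are hypothetical, not the site's actual data model:

    # Hypothetical shape of a quiz question with several accepted answers.
    question = {
        "prompt": ("What is the branch of science that studies the composition, "
                   "properties, reactions and the structure of organic compounds?"),
        "accepted_answers": ["chemistry", "organic chemistry"],
    }

    def is_correct(response, question):
        # Case-insensitive match against any of the accepted answers.
        return response.strip().lower() in question["accepted_answers"]

    print(is_correct("Organic Chemistry", question))  # True once both answers are listed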


I'd expect quiz creators would find embedded feedback from users valuable - "Report a problem" or "Send comment to author". Even with play testing, quiz bugs turn up in use.

> Totally understood.

At least for me, I've found learning from user-testing feedback to be like consulting, in that confidence that I totally understand what's going on is a big red warning flag with sparkles. :) In part because getting from what users say to what they need is such performance art.


In the web-based text version, if I type out my answer too fast, it seems to truncate some of the characters; e.g., I typed "forbade" and hit enter, but it parsed only "forba".


I'm guessing you started in voice mode and then went to text, so your mic is overwriting your text. If you stop the mic or use the text button, it shouldn't do that.


I'm seeing the same behavior in Chrome on OSX.


A random quiz gave me "What is ten times one" ... "What is ten times ten", in order.


User experience: Came to N.A from HN to explore. Saw a long page of "Latest quizzes". Looked for but didn't find a "Recommended quizzes". My experience is that most quizzes are poor, so this was discouraging.

I chose a topic of interest, but one which is often taught badly: "The Periodic Table of Elements Part One". The quiz page gave me a title and the number of questions (57), and clicking "Start quiz with text" gave me the first question. Being on Firefox, I expected voice not to work. I considered switching to Chromium, but didn't. I thought "this will be a long quiz". I looked for a way to see a list of the quiz's questions, but didn't find one.

Rather than beginning to take the quiz, I left the site. Insufficiently engaged for a long quiz while unable to visualize what success would look like. Additional context: I've worked with speech in education, so that had reduced novelty pull for me. This report has been somewhat simplified. FYI, FWIW.


What an awesome idea. Would love to hear more about the stack & roadmap. I see lots in progress.

I see Vue.js and some Material Design layered on top. This would be a fun project to put together with Angular + Firebase + Cloud Functions.

Would love it if these were embeddable, with CRUD via an API.


Definitely, lots more planned. This is just the launch product. I've been thinking of integrating other platforms like Amazon Alexa. Embedding and APIs would also be possible; let me know your use case. You could actually hack the API now, since it's an SPA architecture with a GraphQL API.
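For anyone curious, something like the following could poke at that GraphQL endpoint; the URL path, query, and field names here are purely hypothetical, since the real schema isn't documented in this thread - you'd discover it via introspection or by watching the SPA's network traffic.

    import requests

    # Hypothetical endpoint and query; the actual schema would need to be
    # discovered via introspection or the SPA's requests.
    GRAPHQL_URL = "https://neutron.academy/graphql"  # assumed path

    query = """
    {
      quizzes(first: 5) {
        title
        questionCount
      }
    }
    """

    resp = requests.post(GRAPHQL_URL, json={"query": query})
    print(resp.json())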

As for the stack, the frontend is Vue.js and Vue Material. The backend is Python, Django, Postgres, Elasticsearch, and some Redis sprinkled in for caching.
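(As a rough illustration of the Redis caching piece - an assumption about how it might be wired up with the common django-redis package, not anything confirmed here; "load_quiz_from_db" is a hypothetical stand-in.)

    # settings.py -- one plausible way to point Django's cache at Redis
    CACHES = {
        "default": {
            "BACKEND": "django_redis.cache.RedisCache",
            "LOCATION": "redis://127.0.0.1:6379/1",
            "OPTIONS": {"CLIENT_CLASS": "django_redis.client.DefaultClient"},
        }
    }

    # Elsewhere, expensive lookups can then be memoized:
    from django.core.cache import cache

    def get_quiz(quiz_id):
        quiz = cache.get("quiz:%s" % quiz_id)
        if quiz is None:
            quiz = load_quiz_from_db(quiz_id)  # hypothetical helper
            cache.set("quiz:%s" % quiz_id, quiz, timeout=300)
        return quiz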

It's running on pretty limited resources right now, but I'll scale it up when I get some traction. So if it's a little slow, sorry not sorry, but it seems to be holding up pretty well for using mostly free resources.


Great idea. I tried the verb tenses quiz. It struggled a little to understand me. The voice is also not the easiest to understand.

Edit: Doing a multiplication table drill that includes multiple tables would be useful.


Very cool way to explore the limits of voice interaction. Had to "Skype" a lot of hard questions. Turns out "Anthropologie" isn't a branch of science.


The answer matching could use some work.

In addition to the order of a list being unnecessarily important, as mentioned by other users, in the "Periodic Table" quizzes any element symbols that form a word are interpreted as a word, so your system thinks that "He" (rhyming with "she") is the correct answer for Helium, rather than "H.E." spoken as letters.

I didn't personally test them out, but I imagine Astatine and Beryllium would exhibit the same behaviour, and there are probably more.
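One way the matcher (or a quiz author listing accepted answers) could work around this is to normalize spoken variants of a symbol before comparing - a rough sketch of the idea, not how the site actually does it:

    def normalize(spoken):
        # Collapse spoken variants of an element symbol: "H E", "h.e.", "He" -> "he".
        return "".join(ch for ch in spoken.lower() if ch.isalpha())

    ACCEPTED = {"he", "helium"}  # accept both the symbol and the full name

    for attempt in ["He", "H E", "h.e.", "Helium"]:
        print(attempt, normalize(attempt) in ACCEPTED)  # all True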


It's kind of frustrating that the reply to an incorrect answer is a repeated "incorrect, try again." After a couple of times, I just want to know the answer.


Cool! Not a huge fan of the color scheme/layout, but it's an interesting concept.


Shameless plug: we have a similar project using the Google Chrome Web Speech API.

https://news.ycombinator.com/item?id=14902391


I love this! I would love to integrate this with a deduplicated, taxonomized, and crowdsourced database of facts I am working on - branches-app.com/theplan. Would love to get in contact with you.


Send me a DM on Twitter: https://twitter.com/pizzapanther


It asked, "What are the 3 medals given in the Olympics?"

I replied, "Bronze, Silver, and Gold."

"Incorrect," it said.

Reversing the order was needed.


I tried gold, silver and bronze. It said incorrect.

Then I tried gold, silver, bronze. It said correct.
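Comparing list answers as unordered sets (after a little normalization) would avoid this; a quick sketch of the idea, with hypothetical names, not the site's actual matcher:

    def matches_unordered(response, accepted):
        # Treat a comma/"and"-separated answer as a set so order doesn't matter.
        items = response.lower().replace(" and ", ",").split(",")
        return {item.strip() for item in items if item.strip()} == accepted

    medals = {"gold", "silver", "bronze"}
    print(matches_unordered("Bronze, Silver, and Gold", medals))  # True
    print(matches_unordered("gold, silver, bronze", medals))      # True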


Another shameless plug: triviatemplate.com. Create a Google Assistant trivia game with only a Google Sheet.



