
This couldn't be more wrong. Costco sells excellent-quality meat at great prices.

For example, Costco sells prime-grade brisket for ~$4/lb where I live. You generally can't even find prime brisket at an average grocery store, and choice-grade will routinely run $7/lb or more.


I do compare the same grades of meat. Perhaps it's a US-vs-Canada difference, but I've consistently found this to be the case.


I don’t think he actually did it. He linked to a crypto scam Medium post lol...


> crypto scam medium post

Could you expand on that? I've only skimmed the article, but I don't see any "crypto scams" pushed in it. The article is just about something the author seems to know about (algorithmic trading), applied to the cryptocurrency market. It does promote the author's project (why else would you write a Medium post), but in the worst case, that would be a normal scam and not a crypto scam.


The whole article just seems like an attempt to steal money from uninformed people. He starts by giving vague information about trading strategies in general, links to an article about Renaissance Technologies as an example of successful algorithmic trading, and then states that most trading bots aren't successful and that the crucial differentiator for deciding whose bots to trust is the person's professional experience. That last point is reasonable in itself. However, the picture at the beginning of the article of him at a trading desk, his repeated mentions of his 7 years of experience as a trader, and the complete lack of any actual proof that his bots are profitable make it seem as if he is just trying to profit off his previous experience.

He ends the article writing this:

"All you’ll need to get started is: 1. $1000 2. To press a single button to get the bots started"

Furthermore, in the Reddit comments, in response to the question "130 papers re-implemented in 7 months? I'm blown away. Write a software engineering book about how you did it so quickly. Then write a self-help book about having enough motivation to see it through." he writes:

"100-hour weeks and a desire for a better life for the ones you love will get you there pretty quickly"

This guy seems like a complete fraud; I find it sort of sad that this has landed on the front page of HN with that many upvotes.


Oh, don't get me wrong, I wouldn't trust the guy farther than I could throw him either. I just wanted to be a bit pedantic about "normal fraud" vs. "crypto fraud" (= ICO or similar).


As long as he's not even willing to publish the list of those 130 papers, I think it's OK to be a bit skeptical, yes.


Agreed — something about this post doesn’t pass the smell test.

1. He’s a profitable trader at a tier 1 firm who had the spare time not only to develop a series of algorithms based on 130 research papers, but also to sufficiently backtest them, all within 7 months?

2. He said he looked at the past 8 years of papers, but refers to multiple models correctly predicting the 2008 financial crisis.

3. Where are the code samples?

Edit:

Lol, just realized his Medium post ends with a crypto scam.


Ditto on the "130 papers in 7 months." I am not familiar with the field, but I assume the process would look like this:

* Read and understand paper

* Find and download appropriate input data

* Code paper model and validate (he said he wrote his own code)

I can see myself being able to do this for ONE paper in maybe a week. At 130 papers in 7 months, he'd have been finishing roughly one of these every working day. Wow. So either there is some exaggeration on his part, or he is a total wizard in his field.
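To put the third step in perspective, here's a minimal sketch of what "code paper model and validate" might look like for a toy momentum strategy. Everything below is hypothetical and for illustration only; it is not from his article:

    # Toy sketch of "code paper model and validate" for a single
    # hypothetical momentum paper. All names and parameters are made up.
    import numpy as np
    import pandas as pd

    def momentum_signal(prices, lookback=20):
        # Long when the trailing return is positive, flat otherwise;
        # shift by one bar so we only trade on information already known.
        return (prices.pct_change(lookback) > 0).astype(float).shift(1)

    def backtest(prices, lookback=20):
        # Daily strategy returns; ignores costs, slippage, and borrow.
        position = momentum_signal(prices, lookback)
        return (position * prices.pct_change()).dropna()

    # Validate on synthetic data before touching real data.
    rng = np.random.default_rng(0)
    prices = pd.Series(100 * np.exp(np.cumsum(rng.normal(0.0005, 0.01, 2000))))
    returns = backtest(prices)
    print("annualized Sharpe:", returns.mean() / returns.std() * np.sqrt(252))

And that's the easy case: no costs, no survivorship-bias checks, no out-of-sample validation.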


I think you are a quick study; typically it takes me a week just to figure out the details of what's being done in a paper, and getting the data and testing would take longer. Charitably, he/she has a framework with all the required data sitting ready to go and is just writing wrappers around the models. But even downloading frameworks from GitHub and getting them working takes me a couple of days. For example, I've been playing with the graph-network code from DeepMind for a few weeks. I had to learn how the graphs are represented, how to build and access them, and how the models are made and put together; just working that out was a solid three-day job. Now I can build things, test out what's going on in the examples, and get a feel for the framework, and if there were a problem I would probably be in a reasonable position to say "this doesn't work like they think it does" (it does, no surprise). But unless you've done that leg work, I don't think you really can. A proper replication effort is really one man-month of expert time; anything less and you're just throwing stones.
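For a concrete sense of that leg work, here is a minimal sketch of building a single input graph with DeepMind's graph_nets library. The data-dict keys are the library's documented format; the values are toy data:

    # Minimal sketch of building one graph with DeepMind's graph_nets.
    import numpy as np
    from graph_nets import utils_np

    data_dict = {
        "globals": np.zeros(1, dtype=np.float32),          # 1 global feature
        "nodes": np.random.rand(4, 3).astype(np.float32),  # 4 nodes, 3 features each
        "edges": np.random.rand(3, 2).astype(np.float32),  # 3 edges, 2 features each
        "senders": np.array([0, 1, 2]),                    # edge i starts at senders[i]...
        "receivers": np.array([1, 2, 3]),                  # ...and ends at receivers[i]
    }

    # Pack one or more dicts into the GraphsTuple the models consume.
    graphs_tuple = utils_np.data_dicts_to_graphs_tuple([data_dict])
    print(graphs_tuple.nodes.shape)  # (4, 3)

Even this much presumes you've already worked out which features belong on nodes vs. edges for your problem, which is exactly the part that took me days.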


Depending on the complexity of the model, it would take me at least a month for a single paper. What makes it completely unbelievable to me is the claim of detecting p-value hacking in many of these 130 papers while averaging a paper every day and a half.

To make that claim for a single paper, I would have to (1) be able to reproduce their p-value, and (2) spend enough time with the model to understand which assumptions were unfairly tweaked to get to that p-value.

Just running your own implementation of a model on your own dataset and getting an insignificant or different p-value is not enough. You might just have implemented the model wrongly.
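To be clear, the mechanical part of step (1) is the easy bit. Here is a sketch of the kind of headline significance test you'd be trying to reproduce (all numbers hypothetical):

    # Sketch of reproducing a paper's headline p-value: a one-sample t-test
    # that the strategy's mean daily return differs from zero. The hard part
    # is matching the paper's data and assumptions, not this call.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)
    daily_returns = rng.normal(0.0003, 0.01, 1500)  # stand-in for strategy returns

    t_stat, p_value = stats.ttest_1samp(daily_returns, popmean=0.0)
    print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
    # Getting a different p-value proves nothing by itself: the gap could be
    # your implementation, your data window, or their tweaked assumptions.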


KeyBanc Capital Markets | Equity Data Specialist | Portland, OR | Full-time Onsite | http://www.key.com

KeyBanc Capital Markets Inc., a division of KeyCorp, seeks an equity data specialist to join the KeyBanc Capital Markets First Look Data team in Portland, OR. The specialist will acquire, analyze, and present datasets that help our research team gain unique insight into company models and broader industry trends.

Working collaboratively with equity research analysts and the data team, you will help triangulate key performance indicators and populate investment theses via signals from preexisting and organically acquired data sets. You will use creativity and persistence to deliver actionable insight on a set of companies spanning technology, industrials, real estate, consumer, and healthcare.

You will own end-to-end delivery, working across programming, finance, and design realms to build and maintain data products.

Primary responsibilities include:

- Acquire data from diverse sources, including scraped web data and internal repositories

- Query and transform data

- Work consultatively with the data team and research analysts to identify key performance metrics

- Build self-serve visualizations and data-retrieval platforms

- Present findings in a compelling fashion

Desired Skills & Experience:

- Demonstrated ability (professional or academic) to manipulate large data sets (query, cleanse, join and update)

- Demonstrated ability (professional or academic) to succinctly and visually present findings from data

- Intermediate to advanced Python skills

- Interest in financial markets and understanding company models

- Demonstrated intermediate to advanced SQL capabilities

- Design skills (web layout, template design, BI dashboarding [e.g. Tableau], etc.)

- Familiarity with web-based frameworks such as Django

