
The data is stored in TiDB and queried from there. Is there a reason you assumed it was dynamically loaded in full via the GitHub API?


Yesterday it was taking 10+ seconds to load each dataset. Occam's razor suggests missing caches in such a case (since the data doesn't change with any meaningful frequency).


Interesting. I've never seen it take more than a second, or maybe two.


GPT-3


The concept of "accredited investors", itself, is a great data point for why our implementation of capitalism isn't really capitalism. When opportunities are given explicitly to the rich, you don't have a free market.

I know that's not 100% relevant but it was therapeutic to write it.


Thanks for spreading the word. The DTCC portion of this sounds like they had a "liquidity problem" in the sense that they were trying to prevent one.

The portion about not having a plan for when market makers don't want their action -- and they have to instead pay to use exchanges -- seems like a pretty stupid problem they set themselves up for.

Do I have this right?


I don't think the comparison to a liquidity problem is accurate. Brokers cannot mint credit out of thin air if they don't allow margin. Hence the cash at the broker is backed 1:1, and they won't have liquidation issues.

If Robinhood cannot fulfill these trades on margin, that is totally fine; they can halt margin trades if there are "liquidation issues". For all-cash trades, there shouldn't be any liquidity issues; otherwise it calls into question how they manage their deposit.


The liquidity issues arise in part when trades occur on unsettled securities. Consider what happens if someone buys a stock, then turns around and sells it the same day to another counterparty. The broker has an IOU from one party for the stock, and has issued an IOU to another party. Two days later, if the first party fails to deliver (oops, a short seller couldn't find stock to cover their short!), the broker's on the hook, at least in the short term.


Every RH trade is on credit, because it allows trading before ACH deposits settle and allows withdrawing or trading into other positions before trades settle.


Thanks, I think that perfectly answers this part: "it calls into question how they manage their deposit." :)


I'm confused why you're bringing up "blocking people from selling". Has that ever happened? It certainly isn't happening here.

Edit: I'm guessing it's because the above comment said "only BUYING". I think that was a distinction from blocking both buying AND selling. If you block only one side, it forces the market in one direction.


(InfluxData solution architect here) Boolean is supported. You query it in the WHERE clause. Try `SELECT MAX(temperature) FROM ... WHERE temperature > 10`. That said, I'm not sure why you'd run a query like that in InfluxQL, as it's the same as `SELECT MAX(temperature)`. :)


It's not the same thing. `SELECT MAX(temperature) > 10 FROM ...` gets you all the datapoints, with a value of true or false. Moving it to the WHERE clause only gets you the datapoints where temperature is > 10. Yes, you can fill with 0 after the GROUP BY, but if all datapoints are less than 10, you get nothing back from the database. That's confusing: were all datapoints less than 10, or was there nothing registered in the time window I'm querying? Impossible to tell the difference. Not to mention some user interfaces just bailing: no data. I need to show a chart with a timeline of this condition being true or false.

Plus my actual use case is even more complex: not only do I need something like `MAX(temperature) > 10`, I need `MAX(temperature) > 10 && (MAX(temperature) - MAX(dewpoint)) > 4.5`.
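
FWIW, the closest I've gotten in InfluxQL is a subquery that at least computes the two aggregates side by side (a rough sketch, assuming a hypothetical `weather` measurement with `temperature` and `dewpoint` fields):

    SELECT "max_temp" - "max_dew" AS "spread", "max_temp"
    FROM (
        SELECT MAX("temperature") AS "max_temp", MAX("dewpoint") AS "max_dew"
        FROM "weather"
        WHERE time > now() - 24h
        GROUP BY time(1h)
    )

But the boolean comparisons (`> 10`, `> 4.5`) still have to happen client-side, which is exactly the limitation I'm describing.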


Yep you're right -- oversight on my part. To be clear, the "not sure why you'd run a query like that" was referring to doing it the InfluxQL way (which is not the same thing)...where your results would end up being the same controlling for time range.


Question for influx solution architect:

How do you delete points from a specific measurement in a specific retention policy?


Not a solution architect, but: just delete by a very specific timestamp. This is not possible if you are writing at coarse time precisions, so don't do that.

`delete from meas_name where time='2020-07-22T15:40:58.375506762Z'` - just tried that and it deleted one row.


You didn’t specify the retention policy...?


Blame the customer. Nice solution, architect!


Huh? I don't mean to come off as blaming anyone for anything. I was trying to offer a workaround for their problem with InfluxQL. Workarounds usually exist, and they may not be good enough for everyone, but offering them doesn't hurt. In this case, I certainly misunderstood the need though.


(Solution architect at InfluxData here) Out of curiosity...what was the need for this DELETE? Deleting (not dropping) being somewhat of a "second class citizen" was a design choice to make room for more pressing time series needs. In my experience, `DELETE`ing is rarely necessary.


We work a lot with forecast data, and sometimes mistakes happen: a forecast gets written to the wrong time series or for the wrong period. In these cases we do not want to drop the complete time series but only delete the erroneous parts.

While it is indeed a rare occurrence, the delete itself is also very slow.


I often use metrics for giving reports via Grafana. They are usually 99% correct, which is good enough for many cases. The benefit of doing so is that you already need, and have, nice dashboards for devs to follow what is going on with the system in real time, and the customer is just a Grafana account away. So we do this on several big gov systems. Sometimes customers complain that there is a slight difference between the real state and what the metrics show, but it's not a big deal and rarely happens (when you have millions of things, is it really important to know the 100% precise value in the majority of contexts?)

Recently my colleague was testing some script and added some huge numbers to a metric that is used for one of those reports. We had to delete those test points, as the customer complained that his total invoice number had jumped to trillions.


It was actually a bunch of invalid data coming from an IoT device. So just deleting that time period and limiting to a tag would have been sufficient.
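
Something like this is what I had in mind (a sketch, with a hypothetical `readings` measurement and `device_id` tag; as I understand it, InfluxQL's DELETE accepts tag and time predicates in its WHERE clause, but not fields):

    DELETE FROM "readings"
    WHERE "device_id" = 'sensor-42'
      AND time >= '2020-07-01T00:00:00Z'
      AND time < '2020-07-08T00:00:00Z'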

