There is some nuance here: accounts with no standard charge and a free debit card were common before 2016.

The 2016 change seems to be about fully fee-free accounts, i.e. no overdraft charges or similar fees.


The change (and the Directive) was the result of banks withdrawing fee-free basic bank accounts across the bloc as a cost-cutting measure in the wake of the 2008 financial crisis. The Directive reversed that and made basic bank accounts universally available across the bloc.


Local overrides are super useful for testing site speed:

• Your local setup is likely different from production (not serving from production domains, not using image resize services, using different HTTP compression...)

• You might not be able to run tests on localhost, e.g. if you're an external consultant or work in technical SEO (where you often want to give specific recommendations to devs, since dev attention is scarce and expensive)

There are still some limitations of testing changes in DevTools:

• Testing a fresh load is a pain (need to clear OS-level DNS cache, need to clear the connection cache in Chrome, need to clear cookies/service worker)

• DevTools throttling doesn't represent server connections accurately (you need to throttle at the OS level for accurate data, which slows down your whole computer and requires admin rights; see the sketch below)
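
For illustration, here's a minimal Python sketch of request-level throttling via ChromeDriver (assuming selenium and a local chromedriver are installed). It uses the same CDP-based mechanism as DevTools, so it shares the same accuracy limitations:

  # Request-level network throttling via ChromeDriver, similar to DevTools.
  # Assumes selenium and chromedriver are installed; Chromium-only API.
  import time
  from selenium import webdriver

  options = webdriver.ChromeOptions()
  # A fresh profile avoids old cookies and service workers,
  # but does NOT clear the OS-level DNS cache.
  options.add_argument("--user-data-dir=/tmp/fresh-profile")
  driver = webdriver.Chrome(options=options)

  # Throughput values are in bytes per second (1 Mbps = 125,000 B/s).
  driver.set_network_conditions(
      offline=False,
      latency=100,                      # added round-trip latency in ms
      download_throughput=5 * 125_000,  # ~5 Mbps down
      upload_throughput=1 * 125_000,    # ~1 Mbps up
  )

  start = time.time()
  driver.get("https://example.com")
  print(f"Load event after {time.time() - start:.1f}s")
  driver.quit()

Because the delays are applied to individual requests inside the browser, connection setup still happens on your real network, which is exactly the inaccuracy mentioned above.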

WebPageTest [1] and my own tool DebugBear [2] now support experiments to make it easy to try out content changes in a controlled lab environment.

[1] https://product.webpagetest.org/experiments [2] https://www.debugbear.com/docs/experiments


Is this an actual sales objection that comes up? Are potential customers saying that making the code open source would address their concerns, or would the cost to host and maintain the software internally be too high? Would code escrow be an option, so the source only becomes available if you go out of business?


No! But a lot of non-qualified customers talk up the benefits of going fully open-source, so I take your point that it may not be the best audience to be taking advice from. That's a nice idea about code escrow (although I'm slightly troubled by the incentives it creates, ha ha!) -- I had never heard of that. Do you have any links?

Cost to host should be OK; it's basically fire-and-forget, ignoring any customizations. It requires a bit of expertise if you want to modify things, so maybe that's an issue?


Same – was unable to create new VMs in all regions between 7:15am and 11:41am UK time. Not limited to France.


Don't know much either, but I found this Money Stuff story interesting: https://www.bloomberg.com/opinion/articles/2023-01-04/privat...

Someone was CFO at two companies, and the auditors only checked the year-end balance against his falsified statements. So he temporarily transferred money from the other company to make them match.

"""To avoid detection, Morgenthau doctored African Gold’s monthly bank statements by, for example, deleting his unauthorized transactions and overstating the available account balance in any given month by as much as $1.19 million. [...]

Morgenthau knew that African Gold’s auditor would confirm directly with the bank the actual account balance as of December 31, 2021, as a part of its year-end audit. [...]

Morgenthau deposited more than half a million dollars of Strategic Metals’ funds into African Gold’s bank account on December 31, 2021, because he knew that African Gold’s auditor would confirm the account balance as of that date, in connection with African Gold’s year-end audit. """

https://www.sec.gov/litigation/complaints/2023/comp-pr2023-1...


Interesting. I guess that is the inherent flaw of audit methods that predominantly check the paperwork while rarely venturing out into the real world. With sufficiently bad actors, all the paperwork can be doctored and completely untethered from reality. Such actors only need to build a plausible Potemkin village in the selected spots where the auditors are expected to verify that reality matches the presented paperwork.


Enron did a similar trick: selling buildings to another business entity and buying them back after the audit. I might not have all the details right, but it was the same type of shenanigans. :-)


I wondered about that when reading the Money Stuff article about it a while ago. What should they actually have done differently?

One of the issues was that "she could not share her customer list due to privacy concerns". So maybe JPM could have pushed back against that more?

"""Javice also cited privacy concerns in sharing Frank’s customer data directly with JPMC. After numerous internal conversations, and in order to allay Javice’s concerns, JPMC agreed to use a third-party data management vendor, Acxiom, to validate Frank’s customer information rather than providing the personal identifying information directly to JPMC."""


I was involved in some diligence when a prior company was considering an acquisition. The numbers they claimed vs. the numbers we could trust from their various SaaSes were pretty fishy. It was a small deal, more like $1M. We didn't pursue them; they don't exist any longer.

The gap here was _huge_. If I were on the JPM diligence team, I might have asked them for read-only access to their product analytics. They claimed something like 10K FAFSA applications/day. This should show up nicely in their analytics tools. Yes, they could fake these visits--but it would be much harder to fake that you're getting 10K visits from appropriate regions, at appropriate times of day, with appropriate dwell times, with appropriate distribution of completion rates.
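
As a rough sketch of what those plausibility checks could look like in pandas (the export file and column names here are hypothetical):

  # Rough plausibility checks on a claimed-traffic analytics export.
  # The CSV file and column names are hypothetical.
  import pandas as pd

  visits = pd.read_csv("analytics_export.csv", parse_dates=["timestamp"])

  # ~10K applications/day should show up as a stable daily count.
  print(visits.set_index("timestamp").resample("D").size().describe())

  # Real traffic has an uneven hour-of-day curve; a flat one is suspicious.
  print(visits["timestamp"].dt.hour.value_counts().sort_index())

  # Region mix and dwell times are much harder to fake convincingly.
  print(visits["region"].value_counts(normalize=True).head(10))
  print(visits["dwell_seconds"].describe())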


In most jurisdictions it would generally be possible for the seller to hire outside counsel to validate customer metrics claims under attorney-client privilege without violating consumer privacy laws or customer agreements. The outside attorney could then provide a letter to the buyer attesting to what they found without revealing any specifics about individuals. Of course that would delay the deal, and the buyer here seems to have been irrationally eager to close the acquisition.


> What should they actually have done differently?

Credit card processor revenue reports over prior months may have shown startlingly small revenue. Match that with bank statements.


You pay extra to hide the crime:

""" After the August 3, 2021 Zoom meeting, the Data Science Professor returned a signed version of Frank’s NDA. The Data Science Professor’s usual hourly rate was $300. Javice unilaterally doubled the Data Science Professor’s rate to $600.

[...]

Specifically, on August 5, 2021 at 11:05 a.m., the Data Science Professor provided Javice an invoice for $13,300, documenting 22.17 hours of work over just three days. The invoice entries show that the bulk of his time was spent on the main task that Javice retained the Data Science Professor to perform – making up customer data. The Data Science Professor’s invoice indicated that he performed “college major generation” and “generation of all features except for the financials” while creating “first names, last names, emails, phone numbers” and “looking into whitepages.”

In response to the initial invoice, Javice demanded that he remove all the details admitting to how they had created fake customers – and added a $4,700 bonus. In an email to the Data Science Professor at 12:39 p.m. on August 5, 2021, Javice wrote: “send the invoice back at $18k and just one line item for data analysis.” In total, Javice paid the Data Science Professor over $800 per hour for his work creating the Fake Customer List, which is 270% of his usual hourly rate.

The Data Science Professor provided Javice the revised invoice via email seven minutes later at 12:46 p.m., commenting “Wow. Thank you. Here is the new invoice.” """

https://assets.bwbx.io/documents/users/iqjWHBFdfxIU/rNlNVTl....
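
The rate math in the complaint checks out, give or take rounding:

  # Sanity-checking the figures quoted in the complaint.
  total_paid = 18_000  # revised invoice amount in dollars
  hours = 22.17        # hours billed over three days
  usual_rate = 300     # the professor's normal hourly rate

  effective_rate = total_paid / hours
  print(f"${effective_rate:.0f}/hour")                  # ~$812/hour
  print(f"{effective_rate / usual_rate:.0%} of usual")  # ~271%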


I wonder why the Data Science Professor isn't named/charged as an accomplice. Maybe they are acting as a witness for the prosecution?


It sounds like his initial invoice was quite clear about the work completed, and was then updated at the client's request. So while you can argue moral grounds for not doing this work, I don't think there's illegality, i.e. conspiracy.


I mean, if you are a professor and knowledgeable about how the startup uses the data, it's hardly justifiable to say "oh crap, I didn't know they were using it for illegal purposes".

They were totally complicit allegedly.


This is addressed in the full complaint [1]. The data scientist was told Frank really did have 4 million users, and the scientist only needed to generate this "synthetic data" as a way to "anonymize" their "real" data. I.e. the scientist was duped:

  JAVICE told Scientist-1 [...] that she had a database of approximately 4 million
  people and wanted to create a database of anonymized data that mirrored the
  statistical properties of the original database (the “Synthetic Data Set”).
  
  [After JAVICE sends Scientist-1 the data], Scientist-1 understood that the data
  available via the Access Link Email -
  **a data set of approximately 142,000 people** (emphasis added) -
  was a random sample of a larger database which contained data for approximately
  4 million people. In fact, that data represented every Frank user who had at
  least started a FAFSA.
[1] https://www.justice.gov/usao-sdny/press-release/file/1577861...
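
For context, generating plausible-looking rows like that is trivial. A sketch with the faker library (purely illustrative; the complaint doesn't say which tools the scientist actually used):

  # Illustration only: generating plausible-looking customer rows.
  # The complaint doesn't say which tools were actually used.
  import random
  from faker import Faker

  fake = Faker()
  majors = ["Biology", "Economics", "Nursing", "Computer Science"]

  rows = [
      {
          "first_name": fake.first_name(),
          "last_name": fake.last_name(),
          "email": fake.email(),
          "phone": fake.phone_number(),
          "college_major": random.choice(majors),
      }
      for _ in range(10)
  ]
  print(rows[0])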


Plausible deniability is my non-professional guess.


There is a post about a 38-year-old becoming CEO in 2012 https://www.mortonsalt.com/article/morton-salt-announces-new...

But he claims to have remained CEO until 2019 https://www.linkedin.com/in/christian-herrmann-cfa/


This happened in the late seventies. Nowadays you'd have a hard (though not impossible) time doing it because of federal laws against ageism. However, the tech industry seems to get away with it in hiring.


No idea. I found the question interesting and looked into Enron a bit. No idea how comparable it is.

December 2, 2001 – Enron files for bankruptcy

February 19, 2004 – Jeffrey Skilling surrenders to the FBI, released on $5 million bond

January 30, 2006 – Trial starts

May 25, 2006 – Jury returns verdict

October 23, 2006 – Skilling sentenced

December 13, 2006 – Request to remain free during appeal denied, prison sentence begins


I would expect it to take less time than Skilling for three main reasons.

Enron's crimes were legally murkier because they had followed accounting rules in a convoluted way. The end result wasn't legal, but the individual steps in isolation were arguable. So prosecutors had a daunting task. FTX seems to have flagrantly flouted not just laws but even basic accounting.

Enron was a big hierarchy. Kind of like a mob trial, they got each lower level to turn on the next level. They had to prove the employees doing the actions weren't just rogues. Here, it's straight to SBF.

Skilling was smart enough to hide behind a lawyer and deny everything. SBF seems to be admitting to various aspects of the crime in conversations with journalists.

An added wrinkle is that both the Bahamas and the US (at least) want to try him. It might take time to figure out who tries him first. But I expect that's a fairly quick process.


> SBF seems to be admitting to various aspects of the crime in conversations with journalists.

In one Twitter interview he seems to argue that his company did the same thing you describe about Enron: steps that seemed fine in isolation but got out of hand.


If you are talking about his Vox interview over Twitter, he claimed every step was rational in isolation (he did not claim it was legal).

But the details look different. He admits that FTX said they never invest the deposits. He then claims he loaned that money to Alameda, who invested the deposits. That seems like an important pair of details: those two things combine with other facts to build the legal case. He also admits he didn't really lend the money out to Alameda as a formal loan, probably because that evaded the margin limits.

Look, he's not running around saying "I ran a giant fraud". He may even think he did nothing wrong. But he is agreeing to some of the pieces prosecutors need to prove. Given FTX's bad records, just confirming basic facts is good for the prosecution.


Yeah, expect it to take 2-10 years


Desktop bandwidth is improving over time, but as I understand it, HTTP Archive is still using a 5 Mbps cable connection.

From their FAQ/changelog [1]:

> 19 Mar 2013: The default connection speed was increased from DSL (1.5 mbps) to Cable (5.0 mbps). This only affects IE (not iPhone).

There was another popular article on HN a while ago [2], claiming mobile websites had gotten slower since 2011. But actually HTTP Archive just started using a slower mobile connection in 2013. I wrote more about that issue with the HTTP Archive data at the time [3].

[1] https://httparchive.org/faq [2] https://www.nngroup.com/articles/the-need-for-speed/ [3] https://www.debugbear.com/blog/is-the-web-getting-slower
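
To see why the test connection matters so much for trend charts, compare raw transfer times for the same hypothetical 2 MB page on the two desktop profiles:

  # Why a test-connection change skews trend comparisons: the same
  # page "gets faster" just by switching profiles. Page weight is made up.
  PAGE_MB = 2

  for name, mbps in [("DSL", 1.5), ("Cable", 5.0)]:
      seconds = PAGE_MB * 8 / mbps  # MB -> megabits, divided by Mbps
      print(f"{name} ({mbps} Mbps): {seconds:.1f}s for {PAGE_MB} MB")

That prints roughly 10.7s on DSL vs. 3.2s on cable, with no change to the page itself.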


Regarding "4 seconds wasted" per visit: HTTP Archive also publishes real-user performance data from Google, and only 10% of desktop websites take 4 seconds or more to load. (And I think that's not the average experience but the 75th percentile.) https://httparchive.org/reports/chrome-ux-report#cruxSlowLcp

The Google data uses Largest Contentful Paint instead of Speed Index, but the two metrics ultimately try to measure the same thing. Both have pros and cons. Speed Index goes up if there are ongoing animations (e.g. sliders). LCP only looks at the single largest content element.

When looking at the real-user LCP data over time, keep in mind that changes are often due to changes in the LCP definition (e.g. opacity: 0 elements used to count but no longer do). https://chromium.googlesource.com/chromium/src/+/master/docs...
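
If you want that 75th-percentile LCP number for a specific site, the CrUX API exposes it directly. A minimal sketch (you need your own Google API key; response shape per the CrUX API docs):

  # Query the Chrome UX Report (CrUX) API for a site's p75 LCP.
  import requests

  API_KEY = "YOUR_API_KEY"  # placeholder
  resp = requests.post(
      "https://chromeuxreport.googleapis.com/v1/records:queryRecord"
      f"?key={API_KEY}",
      json={
          "origin": "https://example.com",
          "metrics": ["largest_contentful_paint"],
      },
  )
  resp.raise_for_status()
  lcp = resp.json()["record"]["metrics"]["largest_contentful_paint"]
  print("p75 LCP (ms):", lcp["percentiles"]["p75"])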

