
> Is it too late to create a popular programming language after age 40?
>
> ...
>
> Average age: 37.5

So that would be a clear "no".


I live in a coastal region that has hurricanes, ice storms and tornadoes. Gas stoves work during these storms, electric stoves often don't.

If you force me to use an electric stove then I will buy a wood-fired stove and use it. I will burn the (free) wood from trees downed by the storms and the (free) wood from homes downed by tornadoes. You will eat my pine resin smoke the remaining days of your life and I will give you my stove when you pry it from my cold, dead hands.


You either pay for things up front with taxes, or you pay for them later by way of inflation. There's no free lunch.

The government has chosen inflation. Really, at this point they'd save money by shutting down the IRS, dismantling the entire tax system, and just paying for things by printing what they need. Inflation is the ultimate flat tax.


They lose on each unit, but make it up on volume!

Think about what things "cannot be doubted", with all the brain-in-a-vat types of caveats. It's not trying to be a scientific definition. It operates earlier on the epistemological ladder than science can be meaningfully applied, and that might well be the only reasonable place to define consciousness. (I still can't call it a great definition, even if it did perfectly correspond with the concept. Too indirect.)

The obvious reply to the question is "Probably not!"; see Second Act: What Late Bloomers Can Tell You About Success and Reinventing Your Life https://www.amazon.com/Second-Act-Bloomers-Success-Reinventi...

That accusation is pretty light on proof:

> No astronauts or fellow trainees have verified Dwight's account of Yeager's treatment


Style is hard to objectify. It’s a lot easier to determine whether a 3D model depicts Super Mario than whether it’s in the “modern Nintendo” style.

Style is also very broad. It’s even harder to determine whether an 8-bit NES sprite is in the “old Nintendo” style, because 8-bit sprites don’t have much flexibility to distinguish themselves.

Broadness: imagine if whoever first came up with the “low-poly 3D” or “flat material” or “voxel” aesthetics could copyright them and prevent anyone else from selling anything in those styles. What defines a style as narrow enough that it can be copyrighted? And what if that definition changes, e.g. if a specific voxel style gets copyrighted, and then someone else discovers a brand new way to render voxels super efficiently in only that style?

Objectivity and similarity: an artist can make a concrete object or character which is very similar to a copyrighted one but also clearly distinct. This is very important, because if “similar” objects could violate copyright, where is the line when something is dissimilar enough? Ultimately that line would sit very far out for small artists, who can’t afford to risk lawsuits; vast swaths of clearly not similar characters and objects would be blocked off from them, because in the eyes of the law, and without good representation, they’re no longer “clearly” not similar. In fact, it may be hard for an artist to even come up with an object or character that doesn’t risk a copyright lawsuit, since there are more copyrights than anyone could fully know. (At least to my knowledge, this hasn’t been a frequent issue with copyrightable characters and objects; but if it is, copyrightable style will make it worse, so for the sake of the argument...)

Copyrighting style is basically copyrighting the “similar” works. There’s a fine enough line between whether a character or object is “similar to” or “the same as” another (again, to the best of my knowledge). But there’s no fine line with style. If one tries to define a style with objective criteria, like making their “style” a specific stroke thickness and color scheme, generative AI users will just create art which falls right outside those criteria. If one tries to use an AI classifier (ironically) to deduce whether something is “the same” or “similar but distinct”, it will be foiled by adversarial manipulation and its effectiveness will be endlessly disputed in court. And if one defines their style with very subjective judgements, that leads to the issue above.


What isn't prominently mentioned in the article is that endurance and retention are highly related --- flash cells wear out by becoming leakier with each cycle, so the more cycles a cell goes through, the faster it loses its charge. The fact that SLC only requires distinguishing between two states instead of 16 for QLC means that the drive will also hold data for (much) longer in SLC mode for the same number of cycles.

In other words, this mod doesn't just get you extreme endurance, but much better retention as well. Retention is usually specified by manufacturers as N years after M cycles; early SLC was rated for 10 years after 100K cycles, but this QLC might be 1 year after 900 cycles, or 1 year after 60K cycles in SLC mode; if you don't actually cycle the blocks that much, the retention will be much higher.
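To make the "N years after M cycles" idea concrete, here's a rough back-of-the-envelope sketch in Python. It just interpolates linearly between a fresh-cell retention figure and the rated end-of-life figure; the 10-year fresh value and the linear scaling are my own assumptions for illustration, and the specs are the hypothetical numbers above, not datasheet values.

```python
# Back-of-the-envelope sketch, not a real wear model: the specs below are the
# hypothetical figures from this comment, and the 10-year "fresh cell"
# retention plus the linear scaling are assumptions picked for illustration.

def naive_retention_years(rated_years, rated_cycles, cycles_used,
                          fresh_years=10.0):
    """Crudely interpolate between fresh-cell retention and rated
    end-of-life retention, assuming retention falls roughly linearly
    with the fraction of rated cycles consumed."""
    wear = min(cycles_used / rated_cycles, 1.0)
    return fresh_years + (rated_years - fresh_years) * wear

specs = {
    "early SLC (10 y after 100K cycles)": (10, 100_000),
    "QLC (1 y after 900 cycles)":         (1, 900),
    "QLC in SLC mode (1 y after 60K)":    (1, 60_000),
}

for name, (rated_years, rated_cycles) in specs.items():
    lightly_used = naive_retention_years(rated_years, rated_cycles, rated_cycles // 10)
    worn_out = naive_retention_years(rated_years, rated_cycles, rated_cycles)
    print(f"{name}: ~{lightly_used:.1f} y at 10% of rated cycles, "
          f"~{worn_out:.1f} y at 100%")
```

Under those (crude) assumptions, a lightly cycled drive keeps most of its fresh retention regardless of mode, which is the point about not cycling the blocks that much.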

I'm not sure if the firmware will still use the stronger ECC that's required for QLC vs. SLC even for SLC mode blocks, but if it does, that will also add to the reliability.


I do think that if we built the Shuttle today, it would be much more reliable than it was. Successes of vertical Falcon-9 landings notwithstanding.

The bandwidth costs are the key. Good luck getting rates anywhere near Google's effective rates. Spoiler: you can't. You probably can't realistically even get within 5x of their costs, byte for byte.

Which makes competing with them effectively impossible for all but a very few other megacorps.


> programmers typically peak in their 30s

Should programmers expect to get worse at their job in their late 30s or as they enter their 40s? This seems like a surprising conclusion.


Not OP, but thank you for the detailed feedback. I've been considering various options, including Swing, for a small desktop tool.

May I ask how apps made with Electron, Tauri, Wails, or other WebView-based GUI toolkits perform on macOS in these regards?


Counterpoint: I don’t have a neckbeard and Myst and Riven are the best single player games I’ve ever played. Period.

Nothing wrong with minimum wage and OSHA. When the government creates a level playing field, capitalism can flourish.

When companies stop competing with each other, and start trying to suck as much money out of the government as they can - that's when things fall apart.


Mostly agree. A winged design can be much gentler during atmospheric reentry compared to capsules and can probably even slow passively to landing speed, the drawback being that it requires a landing strip. Winged vehicles are also easier to control, both in choosing where to land and in the option of carrying atmospheric engines, which would put them close to aircraft in how they handle landing.

Oh, I'm sure there are plenty of hurdles; I was just wondering if the latency problem could be solved by having the computers in orbit.

I shoot RAW from an older Canon 5D which Resolve does not read natively. So there's a bit of a conversion step going from CR2. My typical workflow is to use Adobe RAW to process the images, then import the RAW directly to AE to render out with whatever repositioning or cropping. Let's not forget LR Timelapse[0] as part of the workflow too.

[0] https://lrtimelapse.com/


Counterpoint: I don’t have a neckbeard and Myst and Riven are the best games I’ve ever played. Period.

Exactly right.

The money space programs spend is but a drop in the ocean compared to military budgets.

We can - and should - do more than one thing with government budgets.


Too poor to fix their shift key

I hadn't encountered conscious agent theory before. I took a quick look and it seemed to be solipsism wearing a disguise. Can you elaborate on how it distinguishes itself from solipsism in its arguments for why it might be real?

I found the evolutionary argument rather odd. The disconnect between perception and reality is pretty much the standard belief these days. Unless I'm reading it wrong, it was making the claim that 'reality' is a non-causal artifact of conscious entities, but one that was caused by evolution, which seems contradictory.


> I don't think AI is going to send out nukes, I do believe it's going to replace every single person's job and make the powerful even more powerful.

The shitty part being: it won't replace people's jobs at equivalent skill and quality levels. It'll just make the output and consumer experience much, much shittier, because for some weird-ass reason a lot of these executive types seem to literally believe that minimizing labor costs is more important than providing value for the customer.


That link doesn't make a claim quite that strong. I also don't know anyone that has eaten it.

Given that I know dozens of people who demonstrably lost their sensitivity to poison oak via the accidental chronic exposure regimen I outlined above, at the very least it should raise a scientific question. It would be easier to dismiss if it was an isolated case or two. No one exposes themselves like that intentionally.


> Nobody blames Google when the answers are wrong

Don't they?

https://en.wikipedia.org/wiki/Criticism_of_Google#Possible_m...


I would imagine that's too insignificant to factor into that particular calculation.

That can't possibly be true.

East Asian countries have a long tradition of lacquerware, which is made with urushiol-containing saps. https://en.wikipedia.org/wiki/Lacquerware

In fact, urushi is the Japanese word for lacquer; the plant is in the genus Toxicodendron.

Like most jobs until recently, making lacquerware was hereditary, and (clearly) the people making it were able to withstand sustained and direct exposure. It's possible that a genetic proclivity is involved in the ability to do the work, but just as clearly, there is hyposensitivity gained through exposure.

Let me back that up with a citation. https://pubmed.ncbi.nlm.nih.gov/1839723/


No, we don't. We don't even know if existence has an extent in time, because our only way of "interfacing" with time is our experience of memory we can't prove is real.

For all you know, you're a lone entity confined to an infinitely short period of time, and all else is an illusion.

But of course this isn't a useful assumption in most respects.


What I'm saying is that they will have to be as soon as they accept an external contribution. The code they accept will be licensed under the AGPL by the contributor. A hosted-only version including both the contribution and hosted-only features would be a derived work, and thus require source disclosure under the terms of the AGPL.

To be fair, Catania has more fabs than any other city in Italy (two, with another under construction), and it's also home to the largest STMicroelectronics site in Italy, so I guess it kinda is Italy's Silicon Valley.
