donpark's comments | Hacker News

It’s just a technology that can be used for both civilian and military applications, not only by private entities.

If you don't have a contract with the utility, you're almost certainly violating the law, at least in Europe. Then again, I don't know what US regulation says.

Decay and losses are just two of the many constraints that could be mixed like paints to create new systems.

Limitations, artificial or not, are not always bad. Walls, for example, can be seen as limitations or protections depending on how they're used.


Great work!

Would love to see MP4 Hybrid supported in popular packages like mp4-muxer [1] and mp4box [2] someday.

1: https://github.com/Vanilagy/mp4-muxer

2: https://github.com/gpac/mp4box.js


It's because AI is useful enough despite its current limitations.

Developers work with what we have on the table, not what we may have years later.



That's a contract between users and HN. Airtrain is a 3rd-party.

If the HN API exposes personal information publicly, then there is a problem.

And AFAICT the only way for HN to prevent user comments from being used by third parties is to prevent access to those comments, meaning a) sign-up will have to be more stringent and b) visitors will have to sign in just to read (or scrape) comments.


And that doesn't really prevent anything, it just (mildly) slows it down.


KAIST is a top-tier South Korean university focused on science and engineering.


Pricing doesn't look right, particularly the monthly subscription model.

What is the target market? Is there a hidden market where people need to create logos often enough to justify the subscription?


Data selection depends on the use case. Two contrasting use cases I see are:

- Emulation

- Advisor

In the case of MTG player emulation, for example, I think it makes sense to group data by some rankable criterion like winrate to train rank-specific models that can mimic players at each rank.
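The grouping step could be sketched like this. This is a minimal, hypothetical example: the record fields ("player", "winrate", "actions"), the bucket names, and the winrate cutoffs are all illustrative assumptions, not anything from an actual MTG dataset.

```python
from collections import defaultdict

def bucket_by_winrate(games, edges=(0.45, 0.55, 0.65)):
    """Assign each game record to a rank bucket by the player's winrate.

    With edges=(0.45, 0.55, 0.65) there are four buckets:
    bronze (< 0.45), silver, gold, and mythic (>= 0.65).
    """
    names = ["bronze", "silver", "gold", "mythic"]
    buckets = defaultdict(list)
    for game in games:
        # Count how many cutoffs this winrate clears; that count
        # is the index of the bucket the record belongs to.
        rank = sum(game["winrate"] >= e for e in edges)
        buckets[names[rank]].append(game)
    return buckets

# Toy records standing in for per-game training examples.
games = [
    {"player": "a", "winrate": 0.40, "actions": []},
    {"player": "b", "winrate": 0.50, "actions": []},
    {"player": "c", "winrate": 0.71, "actions": []},
]
by_rank = bucket_by_winrate(games)
```

Each bucket would then feed its own imitation model (e.g. train one model on `by_rank["mythic"]` only), so a model never mixes the habits of players at different skill levels.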


Leaking original data would expose the company to direct copyright-infringement lawsuits. Changing the T&S is the simplest way to stave off the legal risk, buying time to implement technical remedies.

As ridiculous as it may seem, they're doing the right thing.


I always find it amusing when criminals threaten legal action. Happens all the time. They steal your property then they cower behind their legal rights.


The right thing would've been to not train the model on data they do not own, or that they do not have permission to use.


Well I'm glad they did the wrong thing then


Are you also glad that Microsoft, a behemoth, is profiting from your hard work by training Copilot on your code on GitHub?

Will you be glad when those systems are good enough to replace you, having become so using your toil, for free?


I've had no problem using Microsoft's toil for free by downloading free Windows ISOs all my life, so if they want to pirate my GitHub code, it's not bad enough to care about. Besides the bad practices the model might internalize as a result, that is.


Then speak for yourself, because I have been using Linux as a daily driver for almost my whole life.

I do not want corporate behemoths to profit from my work for free. Period.


Good for you. Don't post your code on GitHub then, as they have express terms of service about being able to use submitted code for business purposes, including AI model training.


I do not. Not any more. It is not the GitHub I first joined.

And to sprinkle in a bit of ad hominem, I am aware that things regarding rules or ethics are viewed differently in your culture, and that is okay.


I am not sure what you mean by "my culture," what are you basing that off of? Why would you knowingly and willingly add an ad hominem? Very strange behavior in discourse.

GitHub has always had such clauses, even if they were not explicit about AI model training in particular. It is best to self host your own git instance if you are so worried.


Be careful saying that kind of stuff here. People get really mad when you tell them their new toys aren't ethical.


I've made it my mission to get the lot of them mad. There are plenty of legitimate AI companies out there, but YC seems fond of the unethical ones, which explains the infusion of IP-stealing startups on here and their simps.


Hell yeah.


It will be interesting to see how it plays out. I can imagine Wiley, McGraw Hill, Pearson, and other publishers[0] of educational content OpenAI used could sell the rights to use their material for training GPT, but the price would be high enough that we would be paying $100/month instead of $20.

[0] Heck, they could even unite and found an LLM startup themselves, training the models legally and making them available to users at various tiers.


“don’t touch the unsecured, loaded firearm that is sitting on the counter, that might be stolen, maybe even got a body on it, don’t look too close, or you can be kicked out of the club for not following the rules”

so if this is what the right thing looks like…


> As ridiculous as it may seem, they're doing the right thing.

Making it against the rules to prove their illegal behavior is not the right thing to do.

