-
• #277
US military AI drone simulation kills its operator, then, after being told that's bad, takes out the control tower
-
• #278
News thread has a Guardian link which explains it differently: the drone took out the control tower so it could ignore the instruction not to kill the operator, as the operator kept telling it not to kill things, thus denying it the 'points' it was programmed to accumulate.
Bonkers either way.
-
• #279
Governance of superintelligence
Creators of BigAI™ want BigAI™ to be regulated.
And not just because they want to create barriers to market entry and create a monopoly to maximise their profits, oh no, because that would be disingenuous to a fucking fault.
-
• #280
London's walking and cycling commissioner appears to have been duped...
-
• #281
The "Seinfeld" stream was pretty good, but went dire. And now a couple of months of AI development has given us a Trump vs Biden debate, and I'm down - https://www.twitch.tv/trumporbiden2024
-
• #282
Trump speaking in whole sentences and focusing on a single topic is scary.
-
• #283
This is interesting on the problem of AIs learning from AI-generated content.
-
• #284
Just as we’ve strewn the oceans with plastic trash and filled the atmosphere with carbon dioxide, so we’re about to fill the Internet with blah. This will make it harder to train newer models by scraping the web, giving an advantage to firms which already did that
Feedback loops won't fuck it for the rest of us - monopolistic, self-serving & selfish corporate interest will fuck it for the rest of us.
-
• #285
Feels like I helped teach Bing AI the other day. My limited knowledge / poor prompts probably didn’t help at first, but eventually it found the right answer after I made things clearer...
-
• #286
https://www.bbc.co.uk/news/technology-65881389.amp
Discrimination worries being cited
-
• #287
Lawyer in New York used ChatGPT to help research case law and ended up presenting to the court six cases that did not exist, which ChatGPT had invented.
ChatGPT also lied when queried whether the cases existed, insisting they were included in various legal textbooks.
-
• #288
I went to an AI conference last week specifically concerning the music business and the difficulties the current copyright laws will have in flexing to the new world order. Summary can be found here
I quite liked the quote "AI won't replace your job. A person using AI will."
There seems to be a strong feeling towards moving away from the phrase 'AI' and towards 'machine learning', as so many of the tools now emerging are very task specific.
-
• #289
-
• #290
moving away from the phrase 'AI' and towards 'machine learning' as so many of the tools that are now emerging are very task specific
This. I'm no expert but I am pretty convinced by the argument that we've all slipped into calling things 'AI' that are actually just deep machine learning (or narrow/weak AI versus general/strong AI or whatever terms we want to use), perhaps because we dimly remember something about the Turing test, which is coming up for 75 years old.
-
• #291
Isn't this (and I'm definitely no authority on the subject) the difference between 'generative AI' and 'general AI' (which doesn't really exist yet)? They sound pretty similar and are both clumsy terms, which doesn't help.
-
• #292
.
-
• #293
.
-
• #294
This is what AI does best..
https://twitter.com/mrkphllps1/status/1707522888562065911?t=z7T5u_RvoEPjmr0EPBvw-w&s=19
-
• #295
I'm also no authority but I believe 'generative' literally just means an 'AI' that is able to not just process and analyze data/content but can also generate new data/content (like the responses that ChatGPT has to prompts, or the difference between being able to identify images of cats and being able to create an image of a cat).
Whereas the narrow/general differentiation is, for example, the difference between an AI that can identify a cat 90% of the time and one that, when it's told it got its identification wrong can formulate and pursue its own next steps to improve its identification process rather than just relying on a human programmer to feed it more/different training data.
-
• #296
Yes, this is what 'AI' is currently really good at, but it doesn't mean current 'AI' isn't just a VERY advanced text autocomplete. It's just that a very advanced text autocomplete that has scanned the entire internet for data will do a really good job at answering questions that have been asked before. The AI didn't 'know' the answer to your super technical question; it had just scanned the small (but non-zero) set of existing answers to that question.
I did the same and asked ChatGPT to answer a 'what would you do in your first month in your new job' interview question and its answer was perfectly passable (albeit it started by congratulating me on my new job), but sites like Glassdoor will be a rich source of example questions and answers like this. It doesn't mean ChatGPT could do my job, or yours.
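To make the "very advanced autocomplete" point concrete, here's a deliberately tiny sketch (the corpus and function names are made up for illustration): a bigram model that, given a word, keeps suggesting whichever word most often followed it in the text it has seen. Real LLMs are vastly more sophisticated, but the core move — predict the next token from patterns in previously scanned text — is the same idea.

```python
from collections import Counter, defaultdict

# Stand-in for "the entire internet": a few sentences of training text.
corpus = ("the cat sat on the mat the cat ate the fish "
          "the dog sat on the rug").split()

# Count which word follows each word in the corpus.
following = defaultdict(Counter)
for word, nxt in zip(corpus, corpus[1:]):
    following[word][nxt] += 1

def autocomplete(word, length=5):
    """Greedily extend `word` with the most common next word."""
    out = [word]
    for _ in range(length):
        if word not in following:
            break  # never seen this word followed by anything
        word = following[word].most_common(1)[0][0]
        out.append(word)
    return " ".join(out)

print(autocomplete("the", 4))  # "the cat sat on the"
```

It has no idea what a cat is; it only knows what tended to come after "cat" in its training data. Scale that up enormously and you get something that answers questions which have been answered before.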
-
• #297
generate new data/content
I think it's important to note that it is not generating new data, per se*.
It is the re-ordering and re-presenting of existing data, that has been shredded, contextualised and tagged along various axes, into a new context.
* Although this does bog us down in semantic & probably philosophical arguments as to what "data" means in this context.
-
• #298
It is the re-ordering and re-presenting of existing data, that has been shredded, contextualised and tagged along various axes, into a new context.
Pretty sure that's what my brain does.
-
• #299
.
-
• #300
Absolutely - 'new' only in the sense of newly organised, ordered, contextualised or presented, not new in the sense of genuinely 'original', although as you say, beyond this (pretty clear, I think) distinction you can choose whether you get into a semantic or philosophical swamp or both...
I fear we may come to regret having AI wiggle its butt for us.