(#121) Actually, that's called hallucination: when it doesn't know something, it just spews out random stuff. The OpenAI models just kinda try something else and ask you if that's correct. Llama is a bit more honest, clearly saying "I actually don't know what this is, could you clarify?" instead of giving out code like that.

The OP claims that you won't need to train it, but that's not true: you do. You need to train it to behave the way you want it to, meaning not dishing out code, actually explaining the blocks, and giving helpful hints when needed. The ST has a data center, and we'd need to train the AI to do all of that. And it won't spew out anything it ain't trained on.

Edit: AI uses data centres. Does the Scratch Foundation have money to build one?
However, it's risky if you don't question it and just blindly follow it. For example, I asked it who made TurboWarp, and it spewed out some random programmer who died like 14 years ago; only after that did it admit it didn't know what TurboWarp was. I think we can fix that with the prompt.
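A rough sketch of what that prompt fix could look like, assuming an OpenAI-style chat message format (the system prompt wording and the `build_messages` helper are just examples I made up, not anything the Scratch Team actually uses):

```python
# Hypothetical system prompt telling the model to admit ignorance
# instead of hallucinating, and to give hints instead of code.
SYSTEM_PROMPT = (
    "You are a helper for Scratch learners. Do not write code for the user; "
    "explain blocks and give hints instead. If you are not sure about "
    "something (for example, a project or tool you were not trained on), "
    "say 'I don't know' and ask the user to clarify rather than guessing."
)

def build_messages(user_question):
    """Build an OpenAI-style chat message list, system prompt first."""
    return [
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": user_question},
    ]

# The system message rides along with every question, so the model is
# reminded on each turn to say "I don't know" instead of making stuff up.
messages = build_messages("Who made TurboWarp?")
print(messages[0]["role"])
```

No guarantees this fully stops hallucinations (a prompt is a nudge, not training), but it's the cheapest first thing to try.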