GPWA Times Magazine - Issue 58 - February 2024

#13 NoDepositCasinos
11 December 2023, 10:50 p.m. | Public Member

How much do you consider "too much AI"? AI detectors are not completely reliable and generate many false positives (like everything else, I suppose they will improve over time), and a lower AI content percentage increases the likelihood of errors. To answer your question, I do not verify that every article from the writers I work with is free of AI-generated content. I only check if something seems off. From my perspective, since I use AI tools myself, I don't see why I should criticize someone else who does the same, as long as the work meets the quality criteria. I am paying for a finished article with certain characteristics, and (in my case) the method the person used to create it is not one of them.

#9 Mattbar
8 December 2023, 4:31 a.m. | Private Member

I check for AI content, but the tools are not great and there are false positives. Many of my writers have been working for me for 5-10 years; they know very well what I want, and they deliver. It wouldn't be worth their while to pass off AI-generated content as their own, though of course I don't mind them using AI for research, as one of many research tools. That said, I do check content to be on the safe side, but I would not accuse a writer before seeing several positive results and reading the text myself, comparing it with older work. I would need to be absolutely certain before accusing a writer who has worked for us for five years of submitting AI content. To date I have not had an issue with this. Another point is that we actively try to produce content that AI couldn't write, or would really struggle with. AI is getting better all the time, so it is a constantly changing landscape. But, for now, I am happy that the regular writers we use are all above board. So, yes, I trust the writers.

#10 Giorgos Manousos
8 December 2023, 6:53 a.m. | Public Member

I try to write the text for our project myself, but sometimes, because of the workload, I hand the task to other writers. After receiving their work, I check the text's AI content percentage. Most often it is minimal, but even if the figure is 30%-50% I can still accept the text if it is well written and covers the topic fully. As for Originality.ai, I sometimes run articles I wrote myself through it. Sometimes it shows 20%-30% AI in my texts, although I didn't use any AI tools. I think this is because AI learns from the kind of texts that experts write themselves and tries to imitate them. I wouldn't say that using AI is inherently bad, but if you give a task to a person who simply generates the text with AI and hands it to you, and you pay for it, that's a bad situation. Then you should either change the rate or do the generation yourself (it will save your budget).

#12 Randy Ray
8 December 2023, 11:43 a.m. | Public Member

I haven't found an AI tool that distinguishes human-written content from AI-written content with any kind of reliability. That shouldn't be the bar, anyway. I've met plenty of human "writers" who couldn't write as well as ChatGPT. But I generally want material better than what ChatGPT produces, anyway. It's not hard to spot AI-written content after spending a little time with these tools and learning how these robots write.

#11 DanHorvat
8 December 2023, 8:45 a.m. | Private Member

I suppose people who are worried about AI shouldn't use AI to check for AI; instead, they should get a human to have a look at it. It doesn't matter whether the copy was written by a human who can't write or by AI: if it looks "meh", it's useless either way. If you're reading a good casino review or a good slot review, you'll know it. I'd say this situation is forcing content writers either to up their game or to go flip burgers.
