I do think ChatGPT can be a useful tool in many ways, but every time I see it used, I can't help but think about the ethical concerns. First, submitting a prompt takes roughly 30 times more energy than running a non-AI search, not to mention the amount of water required to cool the data centers that power these operations. Scientific American has a great article on this: https://www.scientificamerican.com/article/what-do-googles-ai-answers-cost-the-environment/
As a fiction author, I am well aware of the cost to creatives, especially authors and artists, who have received no compensation for their work being pirated to build the datasets that AI is trained on. Many copyright cases are in progress right now, and so far some rulings have come down on the side of copyright violation and some on the side of fair use. But now that tech companies are licensing materials directly from publishing houses, where before they simply took those materials free of charge, I think it's clear they recognize they should be compensating others for the work they are profiting from. They aren't going to acknowledge that they should have done so in the first place, though, unless we force them to.
I'm also concerned about how many people are losing the ability to do their own research and don't think it's necessary as long as the results AI generates sound convincing enough. We are coming to accept falsehoods as fact as long as they read well and save us the time it would take to verify them. I've encountered resentment from folks when I point out that they are sharing false news generated by AI. They'd rather not know.
It'll be interesting to see whether, a few years from now, we still feel the usefulness outweighs the ethical costs.
I agree wholeheartedly with the value of struggling with the text ourselves; doing the work makes the sermon resonant because it becomes part of who we are. Using tools for research and ideas is definitely worthwhile, and there's no reason AI or ChatGPT can't be among them. But another side of this issue is that when a sermon is written by ChatGPT, whose words and ideas are showing up there? Without attribution, someone's work lies behind what appears on the screen -- their ideas, their labor, even their words. Yet they will not be cited or acknowledged.
I think this is a very important point. When I read ChatGPT's response, I wonder whose sermons and work it ingested, without permission and without compensation, to string together a pleasing regurgitation of those same thoughts and ideas, minus any credit to the original authors.
Can AI write a novel? I think it can. But we creators would do well not to cross that line. Like you, Ken, I find it’s a great research tool and a good editor. That’s where I leave it.
What’s kind of unsettling is… the voice is not unlike yours. Lots of food for thought here.