At most, I think they should use AI only to correct grammar and punctuation. In no other situation do I see AI having the capacity to write its own news article. It lacks the insight and knowledge of a human being. It's just a string of probabilities, coupling words together based on how often real people used them in the data scraped for LLM training.
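To make the "string of probabilities" point concrete, here's a toy sketch: a bigram model that picks each next word according to how often it followed the previous one in some training text. The training sentence is made up for illustration, and a real LLM is incomparably larger, but the generate-by-sampling principle is the same:

    import random
    from collections import defaultdict

    # Made-up "scraped data" for illustration.
    training_text = "the cat sat on the mat and the cat ate the food".split()

    # Count which words follow which: the model's entire "knowledge".
    follows = defaultdict(list)
    for prev, nxt in zip(training_text, training_text[1:]):
        follows[prev].append(nxt)

    # "Write" by repeatedly sampling a next word in proportion to how
    # often the training text coupled it with the current word.
    word = "the"
    output = [word]
    for _ in range(8):
        options = follows.get(word)
        if not options:  # dead end: no continuation was ever observed
            break
        word = random.choice(options)
        output.append(word)

    print(" ".join(output))  # e.g. "the cat sat on the mat and the cat"

No insight anywhere in there, just frequencies. Scaling it up changes the quality of the output, not the nature of the trick.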
I've found AI is good at inflating and deflating (i.e., summarizing) text. Meanwhile, humans seek deflated text but get uppity if they receive it, acting like it must be inflated or else it's not "professional." As if the other person's wasteful and frustrating act of artificially inflating text were an expression of respect.
It feels like a match made in heaven. Or hell, depending on perspective.
But, yeah: do the research (if using AI for that, watch out for hallucinations, naturally), write bullet points, ask AI to inflate them, cut out the new hallucinations, polish. From what I've experimented with, it basically takes as long as writing from scratch, but the result is better edited.
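For what it's worth, the "ask AI to inflate" step looks roughly like this with the OpenAI Python client. The model name, the prompts, and the bullet points are my own illustrative choices, not anyone's actual newsroom setup:

    # Bullet points in, prose draft out. The research happens before this,
    # and the hallucination hunt happens after.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    bullets = """\
    - City council approved the bike lane budget 7-2 on Tuesday
    - Construction on Main St. starts in March, expected to take 10 weeks
    - Local business group says it was not consulted about parking loss
    """

    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # assumption: any chat-completions model works here
        messages=[
            {"role": "system",
             "content": "Expand the user's bullet points into plain news prose. "
                        "Do not add any fact that is not in the bullets."},
            {"role": "user", "content": bullets},
        ],
    )

    print(resp.choices[0].message.content)  # the draft, to be checked and polished

The "cut out the new hallucinations" pass is the non-negotiable part: anything in the draft that isn't traceable back to a bullet point gets deleted.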
The journalists are correct: this is incredibly disrespectful. Even if the AI were perfect and produced an article with absolutely no mistakes, from what I've gathered from this article, communication was lacking, and this can be perceived as an attempt by Gizmodo to "retire" or "obsolesce" its writers.
Furthermore, I see potential for AI in data analysis, churning through piles of info and retrieving meaningful data for a report, but this is far from that. This is nothing but a glorified Google search result summary. If anything, I draw the line at AI writing the articles itself with little oversight or transparency (and even with full transparency, I'd still shun it without a number of "human" steps). I would much rather have it compile the information and then have a journalist flesh out the article manually.
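If anyone wants to picture that split, here's a hedged sketch of "AI compiles, journalist writes," same client as above. The model name, the JSON schema, and the documents are all invented for illustration:

    # Have the model pull structured claims out of a pile of source text,
    # and leave every sentence of the actual article to a human.
    import json
    from openai import OpenAI

    client = OpenAI()

    documents = [
        "Q3 filing: revenue fell 12% year over year to $4.1B.",
        "Press release: the company denies any plans for layoffs.",
        "Leaked memo (unverified): hiring freeze effective immediately.",
    ]

    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # assumption, as above
        response_format={"type": "json_object"},  # ask for machine-readable output
        messages=[
            {"role": "system",
             "content": 'Extract claims as JSON like {"claims": [{"fact": "...", '
                        '"source": "...", "verified": true}]}. Invent nothing.'},
            {"role": "user", "content": "\n".join(documents)},
        ],
    )

    for claim in json.loads(resp.choices[0].message.content)["claims"]:
        print(claim)  # the journalist verifies each one, then writes the story

The point of the structured output is that it's checkable: every claim carries its source, and the human decides what's actually worth printing.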