"I've got this encouraging external voice going – 'right - what are we going to do [today]?' Like an imaginary friend, essentially."
For months, Kelly spent up to three hours a day speaking to online "chatbots" created using artificial intelligence (AI), exchanging hundreds of messages.

At the time, Kelly was on a waiting list for traditional NHS talking therapy to discuss issues with anxiety, low self-esteem and a relationship breakdown.

She says interacting with chatbots on character.ai got her through a really dark period, as they gave her coping strategies and were available 24 hours a day.
"I'm not from an openly emotional family - if you had a problem, you just got on with it.

"The fact that this is not a real person is so much easier to handle."
People around the world have shared their private thoughts and experiences with AI chatbots, even though such chatbots are widely acknowledged to be inferior to professional advice. Character.ai itself tells its users: "This is an AI chatbot and not a real person. Treat everything it says as fiction. What is said should not be relied upon as fact or advice."
But in extreme examples, chatbots have been accused of giving harmful advice.