Cash Still King?
The saying “cash is king” is often used to reflect the belief that money (cash) is more valuable than any other investment instrument, such as stocks or bonds.
In fact, when I was in graduate school in the 90s, we often used the phrase when analysing companies’ financial statements to determine how solvent or exposed they were in the marketplace.
For instance, a company may be asset-rich but cash-poor, and we would say: cash is king! What this means is that the company might have invested heavily in assets that are not generating enough cash flow to support, for example, its operational activities.
In that case, the company could run into liquidity problems and end up facing solvency challenges. I remember a situation where our lecturer exposed us to another dominant role of cash when expressing an opinion in financial analysis. He explained that cash is fact, but profit is opinion!
What he meant was that credit sales may contribute to an organisation’s overall profit, so when bad debts arise unexpectedly, above the provision already made, the profit position suffers some impairment. But not so with cash: when you have the cash in hand, it is a fact that you have the cash. Yeah!
Well, my interest this week is not to start another MBA 101 class, but rather to explain how cash remaining king in the literal sense can bring about, and has brought about, some of the evils that we experience in this world.
To make the cash, some unscrupulous businessmen put profit above people and above care for the planet. Let me explain further. If I were to pose a question about why cash is king in people’s lives, I would not be short of answers at all.
We are all aware that with money in your pocket, you are always handsome and sing beautifully too, regardless! It is expected that people will love money; although money cannot solve all problems, it can assuage the pain of some of the common daily battles that we face—finding food, clothing and shelter.
Money cannot be the root of evil if, in our pursuit of it, we avoid all the evil on the way. But hold on a minute! What happens when everything is given commodity status and a price?
Then it becomes tradeable, doesn’t it? Well, are you surprised, then, that in some cases even love becomes tradeable? You shouldn’t be; cash is king, isn’t it? All in all, I have set out on a path of teasing something out of you about money so that I can tell you about a new report I read this week. It highlights some further negative effects of Artificial Intelligence (AI), supporting the notion that the love of money without ethics is downright dangerous.
According to Spherical Insights, a reputed market research and consulting firm with a presence in the United States of America (USA) and India, the AI market was valued at USD 68.5 billion in 2022 and is set to grow at 32.5 per cent annually, reaching an expected USD 2,760.3 billion by 2032.
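For readers who enjoy checking such projections, the annual growth rate implied by any two endpoint figures is simple arithmetic. Here is a minimal Python sketch, plugging in the figures as quoted above (the variable names are mine, purely for illustration):

```python
# Sanity-check a market forecast: compute the compound annual growth
# rate (CAGR) implied by the two endpoint figures cited in the text.
# Figures (USD billions) are as attributed to Spherical Insights;
# the formula below is the standard CAGR calculation, nothing more.
start_value = 68.5    # reported market size in 2022
end_value = 2760.3    # projected market size in 2032
years = 10            # 2022 to 2032

implied_cagr = (end_value / start_value) ** (1 / years) - 1
print(f"Implied compound annual growth rate: {implied_cagr:.1%}")
```

Comparing the rate implied by the endpoints with the quoted annual growth figure is a quick way to test whether the numbers in any market forecast tell the same story.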
But the impressive size of the AI market should not lead to digital recreations of dead people, for example, just for the cash. Ethicists argue that the digital recreation of dead people needs urgent regulation, warning that “deadbots” could cause psychological harm to, and even “haunt”, their creators and users.
Researchers from the University of Cambridge’s Leverhulme Centre for the Future of Intelligence (LCFI) have argued in a paper that such services could let users upload their conversations with dead relatives to “bring grandma back to life” in the form of a chatbot.
One of the study’s co-authors, Dr Katarzyna Nowaczyk-Basinska, has stressed that “Rapid advancements in generative AI mean that nearly anyone with Internet access and some basic know-how can revive a deceased loved one,” and that “This area of AI is an ethical minefield.”
“It’s important to prioritise the dignity of the deceased, and ensure that this isn’t encroached on by the financial motives of digital afterlife services, for example.” The study suggests that one risk comes from companies that monetise their digital legacy services through advertising.
As explained in the May 6, 2023 edition of this column, AI ethics matter, and responsible behaviour is necessary in technology development, and in making the cash too. Maybe it is appropriate at this time to revisit portions of that piece.
“In the course of the week, a short story about the resignation of Geoffrey Hinton from Google got me thinking a lot about ethics in the technology space. A story I came across about his resignation had this headline: Godfather of AI, Geoffrey Hinton, quits Google and warns over dangers of misinformation.
Okay. Let me now break it down a bit further, and explain the true impact of this story, and why it made that impression on me. First off, Hinton is not new to the technology world at all. In fact, if you have been an ardent reader of this column, you may have come across a few of his innovations mentioned in past editions.
He is often touted as the godfather of Artificial Intelligence (AI) because of his pioneering work on neural networks. It is common knowledge that Hinton, together with two of his students at the University of Toronto, built a neural network in 2012 which paved the way for current systems and applications, such as ChatGPT”.
Well, that resignation got a lot of people talking. Both The Guardian of the United Kingdom and The New York Times of the United States of America agreed that Hinton was leaving due to concerns over the flood of misinformation, “the possibility for AI to upend the job market, and the ‘existential risk’ posed by the creation of a true digital intelligence”.
In an interview with the New York Times, Hinton stated that until 2022 he was confident that Google had been a “proper steward” of the technology he had pioneered, but his confidence dipped once Microsoft started incorporating a chatbot into its Bing search engine, and Google became concerned about the risk to its search business.
This week, the Cambridge University ethicists are also warning that there are psychological consequences if AI developers of “deadbots” do not operate within the tenets of good ethical conduct.
“No re-creation service can prove that allowing children to interact with ‘deadbots’ is beneficial or, at the very least, does not harm this vulnerable group,” the study stressed. To preserve the dignity of the dead, as well as the psychological well-being of the living, the researchers suggest a suite of best practices, which may even require regulation to enforce.
Money should not be the root of any evil!
botabil@gmail.com