When Man Group CEO Luke Ellis discusses his investment firm’s results with analysts, he chooses his words carefully. He knows better than anyone that the Machines are listening.
The jewel in Man’s crown is its $39 billion AHL hedge fund unit, whose algorithms scour huge datasets for profitable signals that fuel investment decisions.
One of the hottest areas in this field is “natural language processing”, a form of artificial intelligence in which machines learn the intricacies of human language. With NLP, quantitative hedge funds can systematically and instantly scrape central bank statements, social media chatter and thousands of corporate earnings calls every quarter for clues.
As a result, Mr Ellis’ quant colleagues have coached him to avoid certain words and phrases that algorithms may be particularly sensitive to and that could trigger a swing in Man’s share price. Using the word “but”, for example, is much safer.
“There’s always been a cat-and-mouse game, with CEOs trying to be smart in their choice of words,” says Mr Ellis. “But machines can pick up a verbal tic that a human might not even realize is a thing.”
It is a growing phenomenon. Machine downloads of quarterly and annual reports in the United States – scraped by an algorithm rather than read by a human – grew from around 360,000 in 2003 to 165 million in 2016, according to a recent paper from the United States National Bureau of Economic Research. That represented 78% of all such downloads that year, up from 39% in 2003.
The paper – How to Talk When a Machine Listens: Corporate Disclosure in the Age of AI – notes that companies have long wanted to present their business in the best possible light. In response to this evolving readership, they have gradually made their reports more machine-readable, for example by fine-tuning the formatting of tables.
“More and more companies are realizing that the target audience for their mandatory and voluntary disclosures is no longer just analysts and human investors,” note authors Sean Cao, Wei Jiang, Baozhong Yang and Alan Zhang. “A substantial amount of stock buying and selling is triggered by recommendations made by robots and algorithms that process information with machine learning tools and natural language processing kits.”
However, in recent years companies’ adaptation to the reality of algorithmic readers has taken a big step forward. The paper found that since 2011, companies have subtly altered the language of their reports, and the way executives speak on conference calls, to avoid words that could raise red flags for the machines listening in.
It is no coincidence that 2011 was the year Tim Loughran and Bill McDonald, two finance professors at the University of Notre Dame, first published a detailed, finance-specific dictionary of words, which has become popular as a training tool for NLP algorithms.
Since 2011, words deemed negative by the Loughran-McDonald dictionary have declined markedly in corporate reports, while words deemed negative by the Harvard Psychosociological Dictionary – which remains popular among human readers – show no such trend.
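The mechanics of such dictionary-based scoring are simple to sketch. The following is a minimal, hypothetical Python example: the word set is a tiny illustrative stand-in (the real Loughran-McDonald dictionary contains thousands of finance-specific terms), and the ratio it computes is a crude tone measure, not anything a hedge fund would trade on directly.

```python
# A minimal sketch of dictionary-based tone scoring in the style of
# Loughran-McDonald. LM_NEGATIVE is a tiny illustrative subset chosen
# for this example, not the actual dictionary.

LM_NEGATIVE = {
    "loss", "losses", "impairment", "litigation",
    "restructuring", "adverse", "decline", "weak",
}

def negative_tone(text: str) -> float:
    """Return the share of words in `text` that appear in the negative list."""
    words = [w.strip(".,;:!?()\"'").lower() for w in text.split()]
    words = [w for w in words if w]
    if not words:
        return 0.0
    hits = sum(1 for w in words if w in LM_NEGATIVE)
    return hits / len(words)

report = "The company recorded a loss amid adverse conditions and litigation."
print(f"{negative_tone(report):.3f}")  # 3 negative words out of 10 -> 0.300
```

A filing rewritten to avoid the listed words scores lower on this measure even if the underlying news is identical, which is exactly the incentive the NBER paper documents.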
In addition, using voice analysis software, the authors of the NBER paper found that some executives even change their tone of voice on conference calls, not just the words they use.
“Executives of companies with higher expected machine readership show more positivity and excitement in their vocal tones, consistent with anecdotal evidence that managers are increasingly seeking professional coaching to improve their vocal performance along quantifiable measures,” the paper says.
Some NLP experts say that some companies’ investor relations departments even run multiple drafts of statements through these algorithmic systems to see which scores best.
[Chart: One word can say a lot. Source: Loughran-McDonald dictionary]
“Access to NLP tools has become an arms race between investors and management teams. We are finding that companies increasingly want access to the same firepower as hedge funds,” says Nick Mazing, director of research at Sentieo, a research platform. “We’re not far from someone on a call saying ‘we said goodbye to our profitability’ instead of ‘we recorded a loss’ because it reads better in some NLP models.”
However, Mr Mazing says that NLP-based algorithms are also continually adjusted to reflect the growing obfuscation of corporate executives, so it ends up being a never-ending and ultimately fruitless game of linguistic acrobatics.
“Trying to ‘outsmart the algos’ is ultimately futile: buyside users can immediately flag sentence classification errors back to the model, so any specific effort to sound positive on negative news won’t work for long,” says Mr Mazing.
This is because the most sophisticated NLP systems do not rely on a static list of sensitive words; they are designed both to identify problematic or promising word combinations and to learn the idiosyncratic style of each individual CEO, Mr Ellis notes. A chief executive who uses the word “challenging” regularly sends little signal by doing so – its sudden absence would be far more telling – while one who never uses the word would send a powerful signal the moment it appeared.
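The per-speaker baseline idea Mr Ellis describes can be sketched in a few lines. This is an illustrative toy under stated assumptions – the function names, data and thresholds are invented for the example, not a description of Man's actual systems – but it shows why deviation from a CEO's own history matters more than any fixed word list.

```python
# Hypothetical sketch: score a word by how far its usage on the latest
# call deviates from the speaker's own historical rate, rather than by
# a one-size-fits-all dictionary. Transcripts are plain strings here;
# a real system would tokenize and clean them properly.

def word_rate(transcript: str, word: str) -> float:
    """Fraction of words in the transcript equal to `word`."""
    words = transcript.lower().split()
    return words.count(word) / len(words) if words else 0.0

def deviation_signal(history: list[str], latest: str, word: str) -> float:
    """Latest usage rate minus the speaker's historical average rate.
    Positive: the word suddenly appeared; negative: a habitual word
    went missing. Near zero: business as usual, weak signal."""
    baseline = sum(word_rate(t, word) for t in history) / len(history)
    return word_rate(latest, word) - baseline

# A CEO who always says "challenging" and suddenly stops:
habitual = ["a challenging quarter", "another challenging period"]
print(deviation_signal(habitual, "a strong quarter", "challenging"))  # negative

# A CEO who never says it and suddenly does:
reticent = ["a strong quarter", "a good period"]
print(deviation_signal(reticent, "a challenging quarter", "challenging"))  # positive
```

The design choice mirrors the article's point: the same word carries opposite information depending on who says it, which is why a static sensitive-word list is easy to game and a per-speaker model is not.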
Machines are still unable to pick up non-verbal cues, such as a physical twitch before a response, “but it’s only a matter of time” before they can do that as well, Mr Ellis says.