A recent Coinbase experiment that used ChatGPT to audit tokens showed how close bots are to becoming part of the security stack.
Everyone is trying out ChatGPT, including cryptocurrency exchanges. Coinbase recently tested whether the AI could replicate its token security review, a process required for every token listed on the exchange.
After analyzing 20 different smart contracts, the popular AI tool matched the manual review's results in 12 of them.
In five of the eight misses, however, the AI mistakenly labeled a high-risk asset as low-risk, the worst possible kind of error.
The experiment also revealed that the AI produced inconsistent results on occasion, with the same prompt eliciting varying responses, particularly when switching from one ChatGPT iteration to the next.
Nonetheless, the Coinbase team is optimistic that, with additional prompt engineering, ChatGPT can be made accurate enough to serve as a secondary quality assurance check.
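A secondary QA check of this kind would compare the AI's risk labels against the manual review and flag the dangerous disagreements, where the manual audit says high-risk but the AI says low-risk. The sketch below illustrates that comparison in Python; the token names, labels, and helper function are hypothetical examples, not Coinbase's actual process or data.

```python
# Hypothetical sketch of an AI-vs-manual audit comparison.
# Token names and risk labels are illustrative, not real data.

MANUAL = {"tokenA": "high", "tokenB": "low", "tokenC": "high", "tokenD": "low"}
AI     = {"tokenA": "low",  "tokenB": "low", "tokenC": "high", "tokenD": "low"}

def compare_audits(manual: dict, ai: dict) -> tuple[float, list[str]]:
    """Return the agreement rate and the worst-case misses:
    assets rated high-risk by the manual review but low-risk by the AI."""
    matches = sum(1 for token in manual if ai.get(token) == manual[token])
    dangerous = [
        token for token in manual
        if manual[token] == "high" and ai.get(token) == "low"
    ]
    return matches / len(manual), dangerous

rate, dangerous = compare_audits(MANUAL, AI)
print(f"agreement: {rate:.0%}, dangerous misses: {dangerous}")
```

In this toy run the AI agrees on three of four tokens but downgrades one high-risk asset, exactly the failure mode the Coinbase experiment highlighted: overall agreement can look acceptable while the rare misses are the costly ones.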
Other crypto-security experts agreed with Coinbase that the tool could add a useful layer of security.
AI security auditors?
The role of security auditors is crucial in ensuring the safety and security of information systems. However, advancements in artificial intelligence (AI) have raised questions about the possibility of replacing manual security auditors with AI technology. While AI can certainly augment the work of human auditors, it is unlikely that it will replace them entirely.
One of the biggest challenges of relying solely on AI for security auditing is that AI systems can only operate within the confines of their programming. Human auditors, on the other hand, are capable of using their intuition and experience to identify potential threats and vulnerabilities that may not be detectable by AI.
The post ChatGPT and the future of security auditing in crypto appeared first on NFT News Pro.