GitHub Copilot is an AI-powered tool that provides autocomplete-style code suggestions for devs and engineers. However, it doesn't come up with the code itself - it draws on code written by other engineers and devs and uses that to inform its suggestions.
The issue is that this carries legal, security, and privacy risks. Some of the code it suggests may be inaccurate or buggy, leading to less secure software, since nothing tells the AI that a given snippet is insecure or malicious.
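To make the "insecure suggestion" risk concrete, here's a hypothetical sketch (not an actual Copilot output) of the kind of pattern an autocomplete trained on public code might surface - string-built SQL - next to the parameterized form a reviewer should insist on:

```python
import sqlite3

def find_user_unsafe(conn, username):
    # Insecure pattern common in public code: user input interpolated
    # directly into the SQL string, which allows injection.
    return conn.execute(
        f"SELECT id FROM users WHERE name = '{username}'"
    ).fetchall()

def find_user_safe(conn, username):
    # Safe pattern: placeholder binding, so input is never parsed as SQL.
    return conn.execute(
        "SELECT id FROM users WHERE name = ?", (username,)
    ).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.execute("INSERT INTO users VALUES (1, 'alice')")

payload = "x' OR '1'='1"
print(find_user_unsafe(conn, payload))  # injection succeeds: returns every row
print(find_user_safe(conn, payload))    # returns no rows
```

Both versions "work" on normal input, which is exactly why a suggestion like the first one can slip past a dev who trusts the tool.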
This article goes into depth on what companies may be facing if their workforce uses this tool: https://www.kolide.com/blog/github-copilot-isn-t-worth-the-risk
What do you guys think of it? Is the efficiency gain worth it when the risks are this apparent?
from hacking: security in practice https://ift.tt/bxy2Fwo