Today, humanity must defend itself against its own arsenals. Too many weapons are already available, and controlling nuclear weapons alone has long been a serious problem. We do not need to add artificial intelligence weapons to the conventional ones.
The greatest value of artificial intelligence lies in its low cost, high efficiency, and scalability: it rapidly enhances its own capabilities through big data and machine learning. Yet humanity's grasp and control of artificial intelligence remain weak and poorly understood. Large numbers of drones and intelligent robots are already being used in military operations. In the future, so-called "weapons that can select targets and engage in war without human intervention" will behave in ways human society cannot predict. On the surface, these intelligent robots and drones will reduce casualties among soldiers, but who are their targets? Mainly humans.
The most troubling problem with artificial intelligence weapons lies in their cheapness and uncontrollability. Unlike nuclear weapons, which require extremely advanced technology, hard-to-obtain raw materials, and enormous costs, artificial intelligence weapons lend themselves to mass production and circulation through multiple channels. Recently, on the battlefield in Syria, we have seen civilian unmanned aircraft converted into grenade-dropping attack drones. A drone costs about as much as a rifle, can be produced in large quantities, and is easily obtained.
The uncontrollability of artificial intelligence weapons also lies in this: even if a major technological power produces weapons that can select targets and engage in war without human intervention, whether such weapons can be kept under human control is unknown. Once endowed with powerful learning capabilities, they will choose for themselves how to attack their targets, and we know nothing about how they will do so. No one can guarantee that they will be effectively controlled, or that they will not be copied by others.
Therefore, before artificial intelligence weapons become the next "atomic bomb," the great powers must take responsibility and form a convention similar to the international agreements prohibiting chemical and biological weapons, banning the use of artificial intelligence in military affairs. We cannot rely solely on the "conscience" of the scientific community. If one day large numbers of small, low-cost, uncontrollable killing machines flood the world, and artificial intelligence weapons escape human supervision altogether, it will be too late to talk about management and control.