Researchers Inject Computer Malware into DNA for the First Time

  In what seems to be a futuristic blending of biology and machine logic, researchers at the University of Washington have demonstrated, for the first time, the ability to encode computer malware into a strand of DNA.  When the strand was sequenced, the malware was used to exploit the computer applications that process DNA sequencing data.

After extensive analysis, key research findings include:

  • The ability to create adverse side-channel information leaks in several DNA sequencing technologies.
  • The discovery that bioinformatics applications used in DNA sequencing contain vulnerabilities, such as insecure function calls and buffer overflows, that allow an adversary to take control of the application or system.
  • A lack of cybersecurity best practices in the coding and implementation of software applications used in DNA processing.
  • The derivation of hypothetical DNA sequencing attack vectors, with recommendations to mitigate potential attacks.

The findings suggest a need for increased cybersecurity awareness in the implementation of DNA sequencing technologies.

Bioinformatics applications are susceptible to computer system vulnerabilities (such as the aforementioned buffer overflows) that are known to result from poor coding techniques.  For years, other professional technology sectors (e.g., banking, energy, transportation) have made significant efforts to eliminate programming vulnerabilities that allow malware code execution in their computer systems.
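
To illustrate the class of flaw involved, the sketch below shows the classic buffer overflow pattern: attacker-controlled input copied into a fixed-size buffer with no bounds check.  The example is hypothetical and not drawn from the paper; the function names and the notion of parsing a DNA read are assumptions for illustration only.

    #include <stdio.h>
    #include <string.h>

    /* Hypothetical sketch, not code from any real bioinformatics tool.
     * The unsafe version copies an attacker-controlled DNA read into a
     * fixed-size buffer with no bounds check. */
    void parse_read_unsafe(const char *read)
    {
        char buffer[64];
        strcpy(buffer, read);                /* overflows if read holds 64+ chars */
        printf("parsed read: %s\n", buffer);
    }

    /* A bounds-checked version of the same operation. */
    void parse_read_safe(const char *read)
    {
        char buffer[64];
        snprintf(buffer, sizeof buffer, "%s", read);  /* truncates safely */
        printf("parsed read: %s\n", buffer);
    }

    int main(void)
    {
        /* A sequencer read long enough to overrun the unsafe buffer. */
        char long_read[256];
        memset(long_read, 'A', sizeof long_read - 1);
        long_read[sizeof long_read - 1] = '\0';

        parse_read_safe(long_read);     /* input truncated, program intact */
        parse_read_unsafe(long_read);   /* undefined behavior: stack corruption */
        return 0;
    }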

Secure programming due diligence, in the form of training, tools, and techniques, is now required in the genome sequencing field, where cyber attacks that once seemed too resource-intensive and technically difficult for hackers to undertake are now a possibility.

The full detailed academic research paper is available at:

http://dnasec.cs.washington.edu/dnasec.pdf

Facebook Shut Down An Artificial Intelligence Program That Developed Its Own Language

  Deep learning uses neural networks that contain one or more hidden layers to learn tasks.  What is the nature of deep learning?  Is deep learning predictable?  More importantly, what are the consequences of deep learning in autonomous machines?  The article linked below, about an experiment at Facebook that took some unexpected turns, feeds into perceptions of artificial intelligence (AI) as either benevolent or malevolent.  Implementing AI raises the question of whether machine learning should be supervised by humans, partially supervised, or completely autonomous.

http://www.msn.com/en-us/news/technology/facebook-shut-down-an-artificial-intelligence-program-that-developed-its-own-language/ar-AAp1wzQ?li=AA4Zoy&ocid=spartanntp
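
For readers unfamiliar with the terminology, a hidden layer is simply an intermediate stage of computation between a network's input and its output.  The sketch below is unrelated to Facebook's system; it shows a forward pass through a single hidden layer, and all weights and sizes are made-up values chosen only for illustration.

    #include <stdio.h>
    #include <math.h>

    /* Minimal illustration of a "hidden layer": an intermediate
     * computation between input and output.  The weights are
     * arbitrary made-up values, not a trained model. */

    #define N_IN  2
    #define N_HID 3

    int main(void)
    {
        double input[N_IN] = { 0.5, -0.2 };

        /* Weights from the inputs to the hidden layer (arbitrary). */
        double w_hid[N_HID][N_IN] = {
            {  0.4, -0.6 },
            {  0.1,  0.8 },
            { -0.3,  0.2 }
        };
        double hidden[N_HID];

        /* Weights from the hidden layer to a single output. */
        double w_out[N_HID] = { 0.7, -0.5, 0.9 };
        double output = 0.0;

        /* Hidden layer: weighted sum of inputs passed through tanh. */
        for (int j = 0; j < N_HID; j++) {
            double sum = 0.0;
            for (int i = 0; i < N_IN; i++)
                sum += w_hid[j][i] * input[i];
            hidden[j] = tanh(sum);
        }

        /* Output: weighted sum of the hidden activations. */
        for (int j = 0; j < N_HID; j++)
            output += w_out[j] * hidden[j];

        printf("network output: %f\n", output);
        return 0;
    }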

AFCEA International Technology Committee Publishes Annual Technology Vectors Brief

  The Armed Forces Communications & Electronics Association (AFCEA) International Technology Committee has released an update of its annual presentation on current technology trends.

The briefing provides insight into the emerging technology topics most relevant to Federal technology leaders and explains why these technologies require further scrutiny.

The technology vectors are presented in a concise knowledge-base format and include points of contact for questions and additional information.

Vector topics include elements and sub-elements covering cloud computing, smart/additive manufacturing, big data analytics, Apache Hadoop & Apache NiFi, advanced cybersecurity, quantum computing, and mobility/wireless communications.

The advanced cybersecurity areas include cyber supply chain anti-counterfeit measures, lightweight encryption for IoT devices, micro-segmentation protection capabilities in data centers, and AI insertion for machine-to-machine security.

Requests for downloads of the presentation can be made at:

http://www.afcea.org/signal/resources/linkreq.cfm?id=114