Exploring Thesis Topics and Technological Surprises in Research

Prof. Neil Rowe's presentation covers thesis topics, training pipelines for naval aviators and flight officers, significant binary correlations with candidate-success attributes, and experiments on anticipating technological surprise using evolutionary algorithms. The content also touches on training data, vehicle-evolution simulations, machine learning of cyberattacks, comparison of executable versions, and defensive deception.



Presentation Transcript


  1. Slides illustrating some thesis topics Prof. Neil Rowe GE-328, (831) 656-2462 ncrowe@nps.edu Papers at http://faculty.nps.edu/ncrowe

  2. Training Pipelines Student Naval Aviator Student Naval Flight Officer

  3. Significant binary correlations with candidate success [Table: success-related attributes from the ASTB, IFS, API, PRI, INT, ADV, and FRS training phases, with columns for the phase, the possible nonnull values, the count of records where the attribute is nonnull, and the numbers of significant positive and negative correlations with success. Attributes include NGCode, RetestStatus, ExamineeStatus, counts of ASTB1-5 and ASTBE tests, IFS status, disenrollment, and failure fields, API NSS and test failures, counts of nonnull values plus academic and flight averages and training status for the PRI, INT, and ADV phases, syllabus status and reason fields, NSS and related scores, and FRS TW6 grade and status.]

  4. Anticipating technological surprise We did an experiment trying to find surprising vehicle types. Properties used: Weight (numeric): 10, 50, 250, 1250, 6250 kilograms; Volume (numeric): 1, 5, 25, 125 cubic meters; Power (numeric): 1, 10, 100, 1000 horsepower; Maneuverability, interpreted as turn radius (numeric): 1, 2, 4, 8 meters; Count (numeric): 1, 3, 10 items; Material (nonnumeric): wood, steel, plastic, composite; Habitat (nonnumeric): overland, water-surface, underwater, air, space; Shape (nonnumeric): car, ship, aircraft, box, sphere, plate. We used an algorithm simulating biological evolution (sketched below).
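
The slides do not give the rating function or the evolution operators, and the example output on the next slide shows intermediate values (e.g., 22.36 kg) beyond the listed grid, so the following is only a minimal Python sketch of the general idea under those assumptions: evolve property vectors by selection and mutation against a placeholder surprise rating.

```python
import random

# Property grid from the slide.  The rating function below is a made-up
# placeholder; the actual surprise rating used in the experiment is not shown.
PROPERTIES = {
    "weight_kg":     [10, 50, 250, 1250, 6250],
    "volume_m3":     [1, 5, 25, 125],
    "power_hp":      [1, 10, 100, 1000],
    "turn_radius_m": [1, 2, 4, 8],
    "count":         [1, 3, 10],
    "material":      ["wood", "steel", "plastic", "composite"],
    "habitat":       ["overland", "water_surface", "underwater", "air", "space"],
    "shape":         ["car", "ship", "aircraft", "box", "sphere", "plate"],
}

def random_vehicle():
    return {name: random.choice(values) for name, values in PROPERTIES.items()}

def rate(vehicle):
    # Placeholder "surprise" score: reward high power-to-weight ratios and
    # unusual combinations such as plates in space.
    score = vehicle["power_hp"] / (vehicle["weight_kg"] + 1.0)
    if vehicle["habitat"] == "space" and vehicle["shape"] == "plate":
        score += 2.0
    return score

def mutate(vehicle):
    child = dict(vehicle)
    name = random.choice(list(PROPERTIES))
    child[name] = random.choice(PROPERTIES[name])   # change one property at random
    return child

def evolve(generations=100, pop_size=40, keep=10):
    population = [random_vehicle() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=rate, reverse=True)
        parents = population[:keep]                 # selection of the fittest
        population = parents + [mutate(random.choice(parents))
                                for _ in range(pop_size - keep)]
    return sorted(population, key=rate, reverse=True)

for vehicle in evolve()[:5]:
    print(f"rating {rate(vehicle):.4f} | {vehicle}")
```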

  5. Example output of an evolutionary algorithm
     rating 147.1222 | 22.36 kg | 1.0 m^3 | 10.0 hp | 1.0 m | 3.0 count | steel | space | plate
     rating 124.6611 | 22.36 kg | 125.0 m^3 | 10.0 hp | 1.0 m | 3.0 count | steel | space | plate
     rating 118.9851 | 22.36 kg | 25.0 m^3 | 10.0 hp | 1.0 m | 3.0 count | composite | space | plate
     rating 114.8994 | 22.36 kg | 25.0 m^3 | 10.0 hp | 1.0 m | 3.0 count | steel | space | plate
     rating 114.4094 | 111.8 kg | 5.0 m^3 | 1.0 hp | 1.0 m | 1.0 count | steel | pavement | plate
     rating 106.5988 | 237.74 kg | 5.0 m^3 | 1.0 hp | 1.0 m | 3.0 count | composite | water_surface | plate
     rating 99.2401 | 10.0 kg | 0.2 m^3 | 1.0 hp | 1.0 m | 3.0 count | wood | air | sphere
     rating 93.5332 | 50.0 kg | 5.0 m^3 | 1.0 hp | 1.0 m | 1.0 count | steel | water_surface | plate
     rating 92.0674 | 111.8 kg | 5.0 m^3 | 1.0 hp | 1.0 m | 1.0 count | steel | water_surface | plate
     rating 91.3755 | 50.0 kg | 5.0 m^3 | 10.0 hp | 1.0 m | 1.0 count | steel | pavement | box
     rating 88.4605 | 111.8 kg | 5.0 m^3 | 3.16 hp | 1.0 m | 1.0 count | steel | water_surface | plate
     rating 80.3013 | 6250.0 kg | 125.0 m^3 | 1.0 hp | 2.0 m | 3.0 count | steel | space | plate
     rating 79.2743 | 50.0 kg | 0.2 m^3 | 1.0 hp | 1.0 m | 3.0 count | composite | air | sphere
     rating 78.5877 | 6250.0 kg | 11.18 m^3 | 1.0 hp | 2.0 m | 3.0 count | steel | space | plate
     rating 77.9443 | 50.0 kg | 5.0 m^3 | 1.0 hp | 1.0 m | 1.0 count | composite | water_surface | plate
     rating 76.9223 | 1250.0 kg | 125.0 m^3 | 1.0 hp | 2.0 m | 3.0 count | steel | space | plate
     rating 76.7228 | 111.8 kg | 5.0 m^3 | 1.0 hp | 1.0 m | 1.0 count | composite | water_surface | plate

  6. NAWC-WD mission-planning system design (SV-1/2) [Diagram: a client/server environment linking tactical and intelligence feeds, machine-intelligent algorithms (e.g., evolved STAMS, JIFCS, NOMS, MAPEM, and VIPER), a Battle Readiness Engagement Matrix (BREM), track analysis, track accuracy, and weapons-pairing selection support, high-side data collection, processing, and display, low-side scenario processing and BREM selection and display, and tablet-based decision support feeding the composite commander's mission planning and battle-engagement execution, with platforms such as DDG, CG, and CVN in an MTC2 environment.]

  7. A tree for quick lookup of missile-plan simulation results [Figure 1: Part of the first two levels of an example index tree, with branches testing b_off_miss_salvo_size and r_ship_surv against thresholds.]
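
As a rough illustration of the lookup idea only, not the actual implementation, the sketch below walks a small threshold-test tree keyed on the attributes named in the figure; the thresholds and stored results are invented.

```python
# Threshold-test index tree for retrieving precomputed simulation results.
# Attribute names follow the figure; thresholds and leaf values are invented.
class Node:
    def __init__(self, attribute, threshold, below, at_or_above):
        self.attribute = attribute      # plan attribute tested at this node
        self.threshold = threshold
        self.below = below              # subtree or result when value < threshold
        self.at_or_above = at_or_above  # subtree or result otherwise

def lookup(tree, plan):
    """Follow threshold tests down the tree to a cached simulation result."""
    node = tree
    while isinstance(node, Node):
        node = node.below if plan[node.attribute] < node.threshold else node.at_or_above
    return node

# A tiny two-level tree: split on b_off_miss_salvo_size, then on r_ship_surv;
# the strings stand in for full stored simulation records.
tree = Node("b_off_miss_salvo_size", 8,
            Node("r_ship_surv", 2, "result A", "result B"),
            Node("r_ship_surv", 2, "result C", "result D"))

print(lookup(tree, {"b_off_miss_salvo_size": 12, "r_ship_surv": 1}))  # prints "result C"
```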

  8. Plan for data forwarding by the warfighter (edge)

  9. Testing of factors for anomalous aircraft This used a sample of 110 million records of aircraft satellite-surveillance data.

  10. Thwarting adversarial machine learning Develop methods for thwarting adversary attempts to fool our machine-learning algorithms, especially neural-network algorithms (work sponsored by the National Reconnaissance Office). Focus on data for combat identification of aircraft, something we have already studied and have data for. Experimentally manipulate the data to see how little manipulation is necessary to fool machine learning. Propose countermeasures such as: (1) run more than one kind of machine-learning method (sketched below); (2) train using a secret set of more-varied data; (3) use a broader range of inputs; (4) vary the training parameters.
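
A minimal sketch of countermeasure (1), assuming scikit-learn and synthetic stand-in data rather than the actual combat-identification features: train several dissimilar learners and take a majority vote, so an adversarial perturbation tuned to one model is less likely to fool them all.

```python
# Countermeasure (1) sketch: a heterogeneous ensemble with majority voting.
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in for combat-identification feature vectors and labels.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)

ensemble = VotingClassifier(
    estimators=[
        ("logistic", LogisticRegression(max_iter=1000)),
        ("nearest_neighbor", KNeighborsClassifier(n_neighbors=5)),
        ("neural_net", MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000)),
    ],
    voting="hard",   # majority vote of the predicted class labels
)
ensemble.fit(X, y)
print(ensemble.predict(X[:5]))
```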

  11. Foiling machine learning: the stop sign example These were all misclassified as either Speed Limit 45 signs or yield signs. Stickers were affixed to a real stop sign after lengthy experimentation with a neural net trained to recognize stop signs, to find the smallest modifications that would cause it to fail. The neural net is clearly not using the features that humans use to identify stop signs.

  12. Our honeypot design for power plants

  13. Machine learning of cyberattacks Do machine learning on cyberattack data from honeypots (sponsored by the NPS Foundation). Plan: set up honeypots in the cloud, run them for a while, collect cyberattack data, and use machine learning to learn new attack patterns and their clues (a small sketch follows). We have experience with running honeypots, but not in the cloud, and we need much more data. Machine learning with neural networks will be our first try since they have made impressive strides lately. However, there are other machine-learning methods besides neural networks to evaluate, as well as decisions to make about which inputs to use.
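
One possible first step, sketched here with a tiny made-up log and scikit-learn rather than the project's actual pipeline: count alert types per attacking address and cluster the addresses to surface recurring attack patterns.

```python
# Cluster attacking addresses by their alert-type counts (unsupervised).
from collections import Counter, defaultdict
from sklearn.cluster import KMeans
from sklearn.feature_extraction import DictVectorizer

alerts = [  # (source address, alert name) pairs parsed from honeypot logs
    ("131.178.5.110", "WEB-PHP xmlrpc.php post attempt"),
    ("131.178.5.110", "ICMP PING"),
    ("200.29.167.105", "WEB-PHP xmlrpc.php post attempt"),
    ("200.29.167.105", "WEB-PHP xmlrpc.php post attempt"),
    ("202.108.248.58", "ICMP PING"),
]

per_source = defaultdict(Counter)       # alert-type counts for each source
for source, alert in alerts:
    per_source[source][alert] += 1

X = DictVectorizer().fit_transform(per_source.values())
labels = KMeans(n_clusters=2, n_init=10).fit_predict(X)
for source, label in zip(per_source, labels):
    print(f"cluster {label}: {source}")
```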

  14. Example spoof Web exploits (the count listed first)
     10 WEB-PHP xmlrpc.php post attempt 131.178.5.110
     10 WEB-PHP xmlrpc.php post attempt 132.248.103.108
     10 WEB-PHP xmlrpc.php post attempt 151.36.102.85
     10 WEB-PHP xmlrpc.php post attempt 159.148.132.143
     10 WEB-PHP xmlrpc.php post attempt 193.230.177.34
     10 WEB-PHP xmlrpc.php post attempt 195.250.24.66
     10 WEB-PHP xmlrpc.php post attempt 195.5.12.239
     12 WEB-PHP xmlrpc.php post attempt 200.29.167.105
     10 WEB-PHP xmlrpc.php post attempt 200.93.229.210
     10 WEB-PHP xmlrpc.php post attempt 201.144.178.179
     10 WEB-PHP xmlrpc.php post attempt 201.15.239.10
     10 WEB-PHP xmlrpc.php post attempt 201.18.137.170
     10 WEB-PHP xmlrpc.php post attempt 202.107.204.207
     10 WEB-PHP xmlrpc.php post attempt 202.108.248.58

  15. Example attack data, page 2 (drill down on day) [Table: Snort alert daily totals for Jan. 29 - Feb. 4, broken down by alert category: BAD-TRAFFIC, ICMP, INFO, MS-SQL, NETBIOS, SCAN, SHELLCODE, WEB-IIS, and WEB-PHP.]

  16. Example attack data, page 3 (drill down on alert) [Table: daily counts for individual ICMP and INFO alert types, including ICMP Destination Unreachable (host, network, port, and protocol variants), ICMP Echo Reply, ICMP PING (CyberKit 2.2 Windows, Sun Solaris, Windows, and generic), ICMP Time-To-Live Exceeded in Transit, ICMP redirect host and net, ICMP traceroute, INFO FTP bad login, and INFO web-bug 1x1 GIF attempt.]

  17. Comparing versions of executables Analyze differences between different versions of the same executable (sponsored by the Defense Acquisition community). Goals: Detect possibly fraudulent or malicious software from its statistics. Methods: Find consistent mappings of parts of one executable to another. Consider sequences of bytes, sequences of every 2nd byte, sequences of every 4th byte, and sequences of every 8th byte. Find the most consistent mappings. Graphical displays can be very helpful in showing the results of analysis.
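
A simplified sketch of the stride idea, not the actual matching algorithm: extract fixed-length sequences of every k-th byte (k = 1, 2, 4, 8) from two executables and report how many sequences they share at each stride. The file names in the commented-out call are hypothetical.

```python
# Compare two executables by the overlap of their strided byte sequences.
def strided_sequences(data: bytes, stride: int, length: int = 8) -> set:
    strided = data[::stride]            # every stride-th byte
    return {strided[i:i + length] for i in range(0, len(strided) - length + 1)}

def compare(path_a: str, path_b: str) -> None:
    a = open(path_a, "rb").read()
    b = open(path_b, "rb").read()
    for stride in (1, 2, 4, 8):
        seqs_a = strided_sequences(a, stride)
        seqs_b = strided_sequences(b, stride)
        jaccard = len(seqs_a & seqs_b) / max(1, len(seqs_a | seqs_b))
        print(f"stride {stride}: {jaccard:.3f} of sequences shared")

# compare("cdfview_v1.dll", "cdfview_v2.dll")   # hypothetical file names
```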

  18. Typical entropy pattern of an executable file (computed on a 512-byte window with an 8-bit byte-value histogram) [Plot: entropy across the file, with regions labeled header, machine instructions, and data and links.]
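
A sketch of the computation presumably behind such a plot: the Shannon entropy of the 8-bit byte-value histogram in each 512-byte block of the file. The file name is hypothetical.

```python
# Per-block entropy of a file: 0 bits for a constant block, up to 8 bits for
# uniformly random bytes.  Headers, machine instructions, and data/link
# sections tend to sit at different levels.
import math
from collections import Counter

def block_entropies(path: str, block_size: int = 512) -> list:
    data = open(path, "rb").read()
    entropies = []
    for start in range(0, len(data), block_size):
        block = data[start:start + block_size]
        counts = Counter(block)                      # 8-bit byte-value histogram
        total = len(block)
        entropy = -sum((c / total) * math.log2(c / total) for c in counts.values())
        entropies.append(entropy)
    return entropies

# entropies = block_entropies("example.exe")        # hypothetical file name
```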

  19. Example comparison of two executables Colors show matching blocks, with some color reuse.

  20. How versions of cdfview.dll are related Columns represent versions released simultaneously for different operating systems.

  21. We can see malicious files (red) in context

  22. Counts of disk usage over time

  23. Week and hour usage of a disk can classify users Disk 29: Traditional business; disk 403: home user; disk 695: server; disk 855: evening business; disk 994: evening business

  24. Finding the associates of n crowe on his drive (numbers represent distance)
     n crowe|neil rowe|0.047
     n crowe|groby@nas.edu|0.0535
     n crowe|j sundram|0.0693
     n crowe|groby|0.070
     n crowe|831-656-2462|0.0745
     n crowe|pphua@nps.edu|0.0861
     n crowe|neil|0.1009
     n crowe|p phua|0.1161
     n crowe|jonathan r|0.1224
     n crowe|jhwilson@fea.net|0.1229
     n crowe|joblum@msn.com|0.1315
     n crowe|g singh|0.1386
     n crowe|r gamble|0.1438
     n crowe|jonathan_r@hotmail.com|0.1594
     n crowe|idikmen@nps.edu|0.1680
     n crowe|j blum|0.1711
     n crowe|jblum@msn.com|0.1785
     n crowe|begemenozkan@hotmail.com|0.1833
     n crowe|dtwahl@nps.edu|0.2
     n crowe|ncrowe@ern.nps.edu|0.2142
     n crowe|jo blum|0.2168
     n crowe|on|0.2221
     se turner|e turner|0.2251
     n crowe|miller|0.2686
     n crowe|ncrowe@virginia.nps.edu|0.2727
     n crowe|ejsjober@nps.edu|0.275
     n crowe|miller@cs.umd.edu|0.2825
     n crowe|mn kolsch|0.2844
     n crowe|jh wilson|0.2856
     n crowe|rowe|0.2906
     n crowe|km squire|0.2972
     n crowe|shcalfee@nps.edu|0.3
     se turner|suzanne|0.3207

  25. Mexican drive similarities from personal names

  26. Good opportunities for defensive deception with information systems a) delays b) false error messages c) flooding an attacker with information d) baiting an attacker e) camouflage of a system f) fake data files and network nodes

  27. A fake directory system

  28. Will deception hurt us more than them?

  29. Comparisons of different deception effects [Table columns: test type on day 1, test type on day 2, data source, logarithm of the ratio of counts from day 1 to day 2 (mean, with standard deviation in parentheses), and number of subjects with that combination.]
     No deception/uninformed, then deception/uninformed | keylogger | -0.339 (1.290) | 40 subjects
     No deception/informed, then no deception/uninformed | keylogger | +0.013 (0.522) | 29 subjects
     Deception/uninformed, then no deception/uninformed | keylogger | -0.375 (1.845) | 28 subjects
     Deception/informed, then no deception/uninformed | keylogger | +0.345 (1.025) | 26 subjects
     No deception/uninformed, then deception/uninformed | Bash commands | -0.302 (0.916) | 20 subjects
     No deception/informed, then no deception/uninformed | Bash commands | -0.363 (1.127) | 5 subjects
     Deception/uninformed, then no deception/uninformed | Bash commands | +0.096 (1.528) | 7 subjects
     Deception/informed, then no deception/uninformed | Bash commands | -0.620 (0.346) | 6 subjects

  30. Cyberwarfare strategy and tactics topics CW1: Cyberweapons are usually effective only once. When are good times to use them? CW2: Cyberweapons aren't very reliable; how many do we need to use together to get a desired success probability? (See the sketch below.) CW3: How can we build reversible cyberweapons? Ransomware provides some ideas. CW4: How can we get better attribution of cyberweapons to countries?
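
For CW2, under the simplest assumption that weapons succeed independently with the same probability p, using n together succeeds with probability 1 - (1 - p)^n, so the smallest n reaching a target probability P is ceil(log(1 - P) / log(1 - p)). The numbers below are made up for illustration.

```python
# How many independently unreliable cyberweapons are needed for a target
# overall success probability?
import math

def weapons_needed(p_single: float, p_target: float) -> int:
    return math.ceil(math.log(1 - p_target) / math.log(1 - p_single))

print(weapons_needed(0.4, 0.95))   # 6: six 40%-reliable weapons give about 95%
```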

  31. Reversible attacks, visually [Diagram: a user, a deception engine, a data interceptor, a clandestine cache of information, and a locked-up operating system; data is transformed reversibly, e.g., 'Documents' reversed to 'Stnemucod' and Mod(128, data).]

  32. Reversible cyberattacks Another concern: Cyberattacks are hard to repair since damage can be widespread and hard to find. We should encourage more-ethical cyberattacks. Reversibility helps. Examples: (1) encryption of critical programs or data; (2) diversion; (3) obfuscation; (4) deception about status. Cyberattacks can be much more reversible than traditional military attacks.

  33. The attribution problem Attribution of cyberattacks to a country or group is difficult. Backtracing requires cooperation of a wide variety of system administrators, something hard to get. IPv6 may help but addresses can still be spoofed. Stylistic analysis of documents and code can suggest the authors. Can espionage support backtracing? How can we attribute an adversary convincingly for world public opinion?
