- Artificial Intelligence: Principles and Techniques
- Machine Learning
- Machine Learning with Graphs
- Probabilistic Graphical Models
- Decision Making Under Uncertainty
- Advanced Topics in Sequential Decision Making
- RL and Sequential Decision Making
- AGI and Accelerating Scientific Discovery
- Programming Paradigms for AI
- Natural Language Processing with Deep Learning (fast.ai)
- Deep Multi-Task and Meta Learning
- Probabilistic Programming and Models of Cognition
- AI and Healthcare
We cover a 30,000-foot view of the whole field: dynamic programming, neural networks, etc.
AI can be viewed as a spectrum, ranging from low-level intelligence (e.g., instantaneous image classification) to high-level intelligence (a human thinking hard about the next chess move). It also covers specialized algorithms for search problems, Markov decision processes, adversarial games, constraint satisfaction problems, Bayesian networks, and high-level logic (all the rage in the 70s, with SAT solvers).
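To make the "search problems" entry concrete, here is a minimal, hypothetical sketch of uninformed (breadth-first) search in Python. The `neighbors` callback and the toy number-line problem are illustrative assumptions, not material from any of the courses above.

```python
from collections import deque

def bfs_path(start, goal, neighbors):
    """Breadth-first search over an abstract state space.

    `neighbors(state)` returns the states reachable in one step; the function
    returns a shortest list of states from start to goal, or None if unreachable.
    """
    frontier = deque([start])
    parent = {start: None}            # also serves as the visited set
    while frontier:
        state = frontier.popleft()
        if state == goal:
            path = []
            while state is not None:  # walk the parent links back to the start
                path.append(state)
                state = parent[state]
            return list(reversed(path))
        for nxt in neighbors(state):
            if nxt not in parent:     # first visit = shortest path in an unweighted graph
                parent[nxt] = state
                frontier.append(nxt)
    return None

# Toy problem: reach 9 from 0, where each step may add 1 or double the number.
print(bfs_path(0, 9, lambda s: [s + 1, s * 2]))
```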
- GoG - A game-tree search and SAT-based decision-making framework to achieve strategic resilience
- Path-Finder - A path-finding program written in Prolog
- LISP_Basics - Lisp practice programs for learning the basics
- Rock-Paper-Scissors - A hand-gesture detection system written in LabVIEW
- Robo-Soccor - A Lego-based autonomous robot capable of playing soccer with a spray-painted squash ball
See the Machine Learning repo for details.
Add More Intros here:
https://ai.stanford.edu/stanford-ai-courses/
The first paradigm is a logic programming engine based on unification and depth-first search. The second paradigm is imperative: the assert and retract operations allow a program to add and remove its own clauses. Prolog dates from 1972, which makes it an old language. Recent developments in modeling languages based on advanced search algorithms advance both the logic programming and the imperative programming sides. Modern Prolog implementations have added some of these advances, e.g., support for constraint programming and a module system.
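As a rough illustration of the first paradigm, here is a toy unification sketch in Python. It is only a sketch of the idea (no occurs check, no clause database, no backtracking engine) and does not reflect how any real Prolog implementation is written; all names and term representations are assumptions for illustration.

```python
def is_var(t):
    # Prolog convention: variables start with an uppercase letter.
    return isinstance(t, str) and t[:1].isupper()

def walk(t, subst):
    # Follow variable bindings until reaching a non-variable or an unbound variable.
    while is_var(t) and t in subst:
        t = subst[t]
    return t

def unify(a, b, subst=None):
    """Return a substitution (dict) that makes a and b equal, or None on failure."""
    subst = dict(subst or {})
    a, b = walk(a, subst), walk(b, subst)
    if a == b:
        return subst
    if is_var(a):
        subst[a] = b
        return subst
    if is_var(b):
        subst[b] = a
        return subst
    # Compound terms: same length (functor + arguments), unify element-wise.
    if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
        for x, y in zip(a, b):
            subst = unify(x, y, subst)
            if subst is None:
                return None
        return subst
    return None

# parent(tom, X) unifies with parent(tom, bob), binding X to bob.
print(unify(("parent", "tom", "X"), ("parent", "tom", "bob")))   # {'X': 'bob'}
```

A real engine combines unification like this with depth-first search over the program's clauses, backtracking whenever unification fails.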
lisp-practice-tasks
- Define a function (even <list>) which returns the subset (a list) of the even numbers contained in a given numeric, possibly nested list. The result must maintain the order of the even numbers as they appeared in the original list. Example: (even '(1 2 (3 4) -4)) returns (2 4 -4). (A Python sketch of this task appears after the list.)
- Write a function (OccurencesInTree <value> <tree>) which counts the number of occurrences of the value in a numeric tree. Use a breadth-first approach when traversing the tree. Example: (OccurencesInTree 3 '(((1)(2))(5)(3)((8)3))) returns 2.
- Define a function (SumIfNot <list1> <list2>) which returns the sum of all elements in list2 that do not appear in list1. Both lists may be nested lists. Example: (SumIfNot '(1 8 (2)) '(1 (3 (5)) 7 9)) returns 24.
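For readers not writing Lisp, here is a hypothetical Python sketch of the first task's logic (collecting the even numbers from a nested list while preserving order); it is illustrative only, not a supplied solution to the exercise.

```python
def even(xs):
    """Collect the even numbers from a possibly nested list, preserving order."""
    result = []
    for x in xs:
        if isinstance(x, list):
            result.extend(even(x))   # recurse into nested sublists
        elif x % 2 == 0:
            result.append(x)
    return result

print(even([1, 2, [3, 4], -4]))      # [2, 4, -4]
```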
To learn more, see: http://www.paulgraham.com/rootsoflisp.html
Resilient Distributed Datasets (RDDs) are the primary abstraction in Spark: a fault-tolerant collection of elements that can be operated on in parallel. There are two types of operations on RDDs: transformations and actions. Examples of transformations include map, filter, groupBy, and join; transformations are lazy (not computed immediately). Actions include count, collect, save, etc. By default, a transformed RDD gets recomputed each time an action is run on it; however, an RDD can be persisted into storage, in memory or on disk.
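A minimal PySpark sketch of the transformation/action distinction described above; the variable names and the toy dataset are assumptions for illustration.

```python
from pyspark import SparkContext

sc = SparkContext("local[*]", "rdd-demo")

nums = sc.parallelize(range(1, 1001))        # build an RDD from a local collection

evens = nums.filter(lambda x: x % 2 == 0)    # transformation: lazy, nothing runs yet
squares = evens.map(lambda x: x * x)         # another lazy transformation

squares.persist()                            # keep the computed RDD in memory across actions

print(squares.count())                       # action: triggers the computation (500)
print(squares.take(5))                       # reuses the persisted RDD: [4, 16, 36, 64, 100]

sc.stop()
```

Nothing executes until count() runs; because of persist(), take() reuses the cached partitions instead of recomputing the whole lineage.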
Concurrency should not be confused with parallelism. Concurrency is a language concept and parallelism is a hardware concept. Two parts are parallel if they execute simultaneously on multiple processors. Concurrency and parallelism are orthogonal: it is possible to run concurrent programs on a single processor (using preemptive scheduling and time slices) and to run sequential programs on multiple processors (by parallelizing the calculations).
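A small Python sketch of the distinction, under the assumption that threads and processes are acceptable stand-ins: the threaded version is concurrent (the two tasks interleave, and in CPython the GIL keeps CPU-bound threads from running in parallel), while the process pool runs the same sequential function simultaneously on multiple processors.

```python
import threading
from multiprocessing import Pool

def work(n):
    # A CPU-bound toy task: sum the first n integers.
    return sum(range(n))

def concurrent_on_one_core():
    # Concurrency: two threads share the processor by interleaving.
    # In CPython, the GIL makes CPU-bound threads take turns, so this is
    # concurrent but not parallel.
    threads = [threading.Thread(target=work, args=(2_000_000,)) for _ in range(2)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

def parallel_on_many_cores():
    # Parallelism: the same sequential function runs simultaneously in
    # separate processes, one per core if available.
    with Pool(processes=2) as pool:
        return pool.map(work, [2_000_000, 2_000_000])

if __name__ == "__main__":
    concurrent_on_one_core()
    print(parallel_on_many_cores())
```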