Commit 8c966ab

natashadsouzaycool authored and committed

DOCS: added Apollo 3.5 technical tutorial, software architecture, and dual IPC how to

1 parent d1126cb commit 8c966ab

24 files changed: +409 −5 lines
README.md (+2 −2)

@@ -182,7 +182,7 @@ Apollo 3.0's main focus is to provide a platform for developers to build upon in
 * Monitor
 * Additional drivers to support Hardware

-[**Apollo 3.5:**](docs/quickstart/apollo_updated_quick_start.md)
+[**Apollo 3.5:**](docs/quickstart/apollo_3_5_quick_start.md)

 Apollo 3.5 is capable of navigating through complex driving scenarios such as residential and downtown areas. The car now has 360-degree visibility, along with upgraded perception algorithms to handle the changing conditions of urban roads, making the car more secure and aware. Scenario-based planning can navigate through complex scenarios including unprotected turns and narrow streets often found in residential areas and roads with stop signs.

@@ -244,7 +244,7 @@ If at this point, you do not have a Hardware setup, please go to [Without Hardwa
 * [Apollo 3.0 QuickStart Guide](docs/quickstart/apollo_3_0_quick_start.md)
-* [Apollo 3.5 QuickStart Guide](docs/quickstart/apollo_updated_quick_start.md)
+* [Apollo 3.5 QuickStart Guide](docs/quickstart/apollo_3_5_quick_start.md)

 ### Without Hardware:

docs/howto/README.md (+1)

@@ -43,6 +43,7 @@
 - [How to Run MSF Localization Module On Your Local Computer](how_to_run_MSF_localization_module_on_your_local_computer.md)
 - [How to train Prediction's MLP model](how_to_train_prediction_mlp_model.md)
 - [How to use the navigation mode of Apollo 2.5](how_to_use_apollo_2.5_navigation_mode.md)
+- [How to setup Dual IPCs for Apollo 3.5](how_to_setup_dual_ipc.md)

 ### Chinese versions
docs/howto/how_to_setup_dual_ipc.md (+168)

# How to set up Apollo 3.5's software on dual IPCs

In Apollo 3.5, the modules are launched separately from two industrial PCs (IPCs). This guide introduces the hardware/software setup on the two parallel IPCs.

## Software

- NVIDIA GTX 1080 driver
- Apollo 3.5
- Apollo Kernel RT (based on Linux kernel 4.4.32)
- Linux Precision Time Protocol (PTP)

## Runtime Framework

- CyberRT

## Installation

There are two steps in the installation process:

- Clone and install the Linux PTP code on both IPCs
- Clone the Apollo 3.5 GitHub code on both IPCs

### Clone and install Linux PTP

Install the PTP utility and synchronize the system time on both IPCs:

```sh
git clone https://github.com/richardcochran/linuxptp.git
cd linuxptp
make

# On IPC1 (time master):
sudo ./ptp4l -i eth0 -m &

# On IPC2 (slave, synchronized to IPC1):
sudo ./ptp4l -i eth0 -m -s &
sudo ./phc2sys -a -r &
```
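Once both daemons are running, the slave's `ptp4l -m` output can be used to confirm that the clocks are converging. A minimal parsing sketch follows; the log line below is an illustrative sample, not captured from a live run:

```sh
# Illustrative ptp4l -m log line from the slave; "master offset" is the
# clock error relative to the master, in nanoseconds:
line='ptp4l[1234.567]: master offset -1234 s2 freq +10000 path delay 2000'

# Pull out the offset field; values that settle within a few
# microseconds indicate the two IPC clocks are in sync.
offset=$(echo "$line" | awk '{for (i = 1; i <= NF; i++) if ($i == "offset") print $(i + 1)}')
echo "master offset: ${offset} ns"
```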
### Clone Apollo 3.5

Install Apollo 3.5 on the local Ubuntu machine of each IPC:

```sh
git clone https://github.com/ApolloAuto/apollo.git
```

### Build the Docker environment

Refer to the [How to build and release docker](https://github.com/ApolloAuto/apollo/blob/master/docs/howto/how_to_build_and_release.md) guide.
45+
46+
### Run CyberRT on both of IPCs
47+
1. Change directory to apollo
48+
```sh
49+
cd apollo
50+
```
51+
2. Start docker environment
52+
```sh
53+
bash docker/scripts/dev_start.sh
54+
```
55+
3. Enter docker environment
56+
```sh
57+
bash docker/scripts/dev_into.sh
58+
```
59+
4. Build Apollo in the Container:
60+
```sh
61+
bash apollo.sh build_opt_gpu
62+
```
63+
5. Start CyberRT and Dreamview:
64+
```sh
65+
bash scripts/bootstrap.sh
66+
```
67+
68+
6. Open Chrome and go to localhost:8888 to access Apollo Dreamview:
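Before opening the browser, it can help to verify that the Dreamview backend is actually listening. A small check, assuming Dreamview's default port 8888, using bash's built-in `/dev/tcp` redirection:

```sh
# Probe localhost:8888 without any extra tools; prints whether the
# Dreamview backend is reachable on its default port.
if (exec 3<>/dev/tcp/localhost/8888) 2>/dev/null; then
  echo "Dreamview is up"
else
  echo "Dreamview is not reachable"
fi
```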
- On IPC1

The Dreamview header has three drop-downs: a mode selector, a vehicle selector, and a map selector.

![IPC1 Task](images/IPC1_dv.png)

Select a mode, for example "ipc1 Mkz Standard Debug":

![IPC1 mode](images/IPC1_mode.png)

Select a vehicle, for example "Mkz Example":

![IPC1 car](images/IPC1_car.png)

Select a map, for example "Sunnyvale Big Loop":

![IPC1 map](images/IPC1_map.png)

Of all the tasks you can perform in Dreamview, the Setup button is the most general: it turns on all the modules.

![IPC1 setup](images/IPC1_setup.png)

All the hardware components should be connected to IPC1. The localization, perception, routing, recorder, traffic light, and transform modules are also allocated to IPC1.

Use Module Control on the sidebar panel to check the modules running on IPC1:

![IPC1 check](images/IPC1_check.png)

To open Dreamview on IPC2, you must first stop it on IPC1 with the command below:

```sh
# Stop Dreamview on IPC1
bash scripts/bootstrap.sh stop
```
- On IPC2

Start Dreamview on IPC2 with the command below:

```sh
# Start Dreamview on IPC2
bash scripts/bootstrap.sh
```

Select the mode, vehicle, and map in Dreamview, following the same steps as on IPC1:

![IPC2 Task](images/IPC2_setup.png)

The planning, prediction, and control modules are assigned to IPC2.

Use Module Control on the sidebar panel to check the modules running on IPC2:

![IPC2 modules](images/IPC2_check.png)

See the [Dreamview user's guide](https://github.com/ApolloAuto/apollo/blob/master/docs/specs/dreamview_usage_table.md) for details.
7. Start/stop Dreamview:

The current version of Dreamview should not run on both IPCs simultaneously, so run it on only one IPC at a time.

The commands below stop Dreamview on IPC2 and start it on IPC1:

```sh
# Stop Dreamview on IPC2
bash scripts/bootstrap.sh stop

# Start Dreamview on IPC1
bash scripts/bootstrap.sh
```
8. Cyber monitor:

Cyber monitor is CyberRT's tool for checking the status of all the modules on local and remote machines. Users can observe the activity status of all hardware and software components and ensure that they are working correctly.
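From the command line inside the dev container, the interactive `cyber_monitor` tool and the `cyber_channel` listing tool shipped with CyberRT can be used; the channel names below are illustrative samples, not output captured from a live system:

```sh
# Inside the dev container, after sourcing the CyberRT environment:
#   source cyber/setup.bash
#   cyber_monitor        # interactive, per-channel status view
#   cyber_channel list   # plain listing of active channels
#
# For scripting, the listing can be filtered. Sample channel names:
channels='/apollo/localization/pose
/apollo/perception/obstacles
/apollo/planning'

# Count how many Apollo channels are present in the listing:
echo "$channels" | grep -c '^/apollo/'
```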
## Future work (to do)

- Allow multiple Dreamview instances to run simultaneously.
- Fix a bug where modules remain greyed out after clicking the Setup button. In the meantime, users can check each module's status with:

```sh
ps aux | grep mainboard
```
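One caveat with the command above: `grep` will usually match its own command line as well. A common workaround is the bracket trick, shown here against a fabricated `ps aux` row (for illustration only; a real row would come from a running module):

```sh
# Fabricated 'ps aux' row; a real one would show the module's DAG file:
sample='root  4242  1.0  2.0  mainboard -d /apollo/modules/planning/dag/planning.dag'

# '[m]ainboard' still matches the literal string "mainboard", but not
# the grep process itself, whose argv contains '[m]ainboard' instead.
echo "$sample" | grep -c '[m]ainboard'
```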
## License

[Apache license](https://github.com/natashadsouza/apollo/blob/master/LICENSE)

Binary images added under docs/howto/images/: IPC1_car.png, IPC1_check.png, IPC1_dv.png, IPC1_map.png, IPC1_mode.png, IPC1_setup.png, IPC2_check.png, IPC2_setup.png

docs/quickstart/README.md (+1 −1)

@@ -2,7 +2,7 @@
 ## Apollo 3.5

-- [Apollo 3.5 quick start](apollo_master_quick_start.md)
+- [Apollo 3.5 quick start](apollo_3_5_quick_start.md)
 - [Apollo 3.5 hardware system installation guide](apollo_3_5_hardware_system_installation_guide.md)

 ## Apollo 3.0
(New file, +119 lines)

# Apollo 3.5 Software Architecture

Core software modules running on an Apollo 3.5 powered autonomous vehicle include:

- **Perception** — The perception module identifies the world surrounding the autonomous vehicle. There are two important submodules inside perception: obstacle detection and traffic light detection.
- **Prediction** — The prediction module anticipates the future motion trajectories of the perceived obstacles.
- **Routing** — The routing module tells the autonomous vehicle how to reach its destination via a series of lanes or roads.
- **Planning** — The planning module plans the spatio-temporal trajectory for the autonomous vehicle to take.
- **Control** — The control module executes the planned spatio-temporal trajectory by generating control commands such as throttle, brake, and steering.
- **CanBus** — The CanBus is the interface that passes control commands to the vehicle hardware. It also passes chassis information to the software system.
- **HD-Map** — This module works like a library. Instead of publishing and subscribing to messages, it frequently functions as a query engine to provide ad-hoc structured information about the roads.
- **Localization** — The localization module leverages various information sources such as GPS, LiDAR, and IMU to estimate where the autonomous vehicle is located.
- **HMI** — The Human Machine Interface, called Dreamview in Apollo, is a module for viewing the status of the vehicle, testing other modules, and controlling the functioning of the vehicle in real time.
- **Monitor** — The surveillance system for all the modules in the vehicle, including hardware.
- **Guardian** — A new safety module that acts as an action center and intervenes should Monitor detect a failure.

```
Note: Detailed information on each of these modules is included below.
```

The interactions of these modules are illustrated in the picture below.

![img](images/Apollo_3.0_SW.png)

Every module runs as a separate CarOS-based ROS node. Each module node publishes and subscribes to certain topics. The subscribed topics serve as data input, while the published topics serve as data output. The detailed interactions are described in the following sections.
## Perception

Apollo Perception 3.5 has the following new features:

* **Support for the VLS-128 line LiDAR**
* **Obstacle detection through multiple cameras**
* **Advanced traffic light detection**
* **Configurable sensor fusion**

The perception module incorporates the capability of using 5 cameras (2 front, 2 side, and 1 rear) and 2 radars (front and rear), along with 3 16-line LiDARs (2 rear and 1 front) and 1 128-line LiDAR, to recognize obstacles and fuse their individual tracks into a final track list. The obstacle submodule detects, classifies, and tracks obstacles. It also predicts obstacle motion and position information (e.g., heading and velocity). For lane lines, we construct lane instances by post-processing lane parsing pixels and calculate their location relative to the ego vehicle (L0, L1, R0, R1, etc.).

## Prediction

The prediction module estimates the future motion trajectories of all the perceived obstacles. The output prediction message wraps the perception information. Prediction subscribes to the localization, planning, and perception obstacle messages, as shown below.

![img](images/pred.png)

When a localization update is received, the prediction module updates its internal status. The actual prediction is triggered when perception sends out its perception obstacle message.

## Localization

The localization module aggregates various data to locate the autonomous vehicle. There are two types of localization modes: OnTimer and Multiple Sensor Fusion.

The first localization method is RTK-based, with a timer-based callback function, `OnTimer`, as shown below.

![img](images/localization1.png)

The other localization method is the Multiple Sensor Fusion (MSF) method, where a set of event-triggered callback functions are registered, as shown below.

![img](images/localization2.png)

## Routing

The routing module needs to know the routing start point and routing end point in order to compute the passage lanes and roads. Usually the routing start point is the autonomous vehicle's location. The `RoutingResponse` is computed and published as shown below.

![img](images/routing1.png)

## Planning

Apollo 3.5 uses several information sources to plan a safe and collision-free trajectory, so the planning module interacts with almost every other module. As Apollo matures and takes on different road conditions and driving use cases, planning has evolved to a more modular, scenario-specific, and holistic approach. In this approach, each driving use case is treated as a different driving scenario. An issue reported in a particular scenario can now be fixed without affecting other scenarios, whereas in previous versions an issue fix affected other driving use cases, because they were all treated as a single driving scenario.

Initially, the planning module takes the prediction output. Because the prediction output wraps the original perceived obstacles, the planning module subscribes to the traffic light detection output rather than the perception obstacles output.

Then, the planning module takes the routing output. Under certain scenarios, the planning module might also trigger a new routing computation by sending a routing request if the current route cannot be faithfully followed.

Finally, the planning module needs to know the location (Localization: where am I?) as well as the current autonomous vehicle information (Chassis: what is my status?).

![img](images/planning1.png)
## Control

The control module takes the planned trajectory as input and generates the control commands to pass to CanBus. It has five main data interfaces: OnPad, OnMonitor, OnChassis, OnPlanning, and OnLocalization.

![img](images/control1.png)

`OnPad` and `OnMonitor` are routine interactions with the PAD-based human interface and simulations.

## CanBus

The CanBus module has two data interfaces, as shown below.

![img](images/canbus1.png)

The first is `OnControlCommand`, an event-based publisher with a callback function that is triggered when the CanBus module receives control commands; the second is `OnGuardianCommand`.

## HMI

The Human Machine Interface, or Dreamview, in Apollo is a web application that:

- visualizes the current output of relevant autonomous driving modules, e.g. the planning trajectory, car localization, and chassis status.
- provides a human-machine interface for users to view hardware status, turn modules on and off, and start the autonomous driving car.
- provides debugging tools, such as PnC Monitor, to efficiently track module issues.

## Monitor

Monitor is the surveillance system for all the modules in the vehicle, including hardware. It receives data from the different modules and passes it on to HMI for the driver to view, ensuring that all the modules are working without any issue. In the event of a module or hardware failure, Monitor sends an alert to Guardian (the new action center module), which then decides which action needs to be taken to prevent a crash.
## Guardian

This new module is essentially an action center that makes decisions based on the data sent by Monitor. Guardian has two main modes of operation:

- All modules working fine: Guardian allows the flow of control to work normally. Control signals are sent to CanBus as if Guardian were not present.
- Module crash detected by Monitor: if Monitor detects a failure, Guardian prevents control signals from reaching CanBus and brings the car to a stop. Guardian decides how to stop the car in one of three ways, and to do so turns to the final gatekeeper, the ultrasonic sensors:
  - If the ultrasonic sensor is running fine without detecting an obstacle, Guardian brings the car to a slow stop.
  - If the sensor is not responding, Guardian applies a hard brake to bring the car to an immediate stop.
  - In one special case, if HMI informs the driver of an impending crash and the driver does not intervene for 10 seconds, Guardian applies a hard brake to bring the car to an immediate stop.

```
Note:
1. In any of the cases above, Guardian will always stop the car should Monitor detect a failure in any module or hardware.
2. Monitor and Guardian are decoupled to ensure that there is no single point of failure. With this modular approach, the action center can be modified to include additional actions without affecting the functioning of the surveillance system, since Monitor also communicates with HMI.
```
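The stopping logic above can be restated as a small decision table. This is only an illustrative sketch in shell; Guardian itself is a C++ module, and the state labels here are invented for this example, not Apollo identifiers:

```sh
# Hypothetical restatement of Guardian's decision logic; the state
# labels below are made up for illustration only.
guardian_action() {
  case "$1" in
    all_ok)              echo "pass control commands to CanBus" ;;
    ultrasonic_clear)    echo "slow stop" ;;
    ultrasonic_silent)   echo "hard brake" ;;
    driver_unresponsive) echo "hard brake" ;;
  esac
}

guardian_action ultrasonic_clear   # prints "slow stop"
```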
Binary images added under docs/specs/images/: canbus1.png, control1.png, localization1.png, localization2.png, planning1.png, pred.png, routing1.png

docs/technical_tutorial/README.md (+2 −1)

@@ -5,4 +5,5 @@ Refer these documents as a guide to all other relevant documents for the version
 - [Apollo 3.0](apollo_3.0_technical_tutorial.md)
 - [Apollo 3.0 cn](apollo_3.0_technical_tutorial_cn.md)
 - [Apollo 2.5](apollo_2.5_technical_tutorial.md)
 - [Navigation mode cn](navigation_mode_tutorial_cn.md)
+- [Best Coding Practice](best_coding_practice.md)
