Unreal Engine Lidar Simulation

Lidar Localization with Unreal Engine Simulation

This example shows how to develop and evaluate a lidar localization algorithm using synthetic lidar data from the Unreal Engine® simulation environment.

Developing a localization algorithm and evaluating its performance in varying conditions is a challenging task. One of the biggest challenges is obtaining ground truth. Although you can capture ground truth using expensive, high-precision inertial navigation systems (INS), virtual simulation is a cost-effective alternative. Simulation enables testing under a variety of scenarios and sensor configurations, supports rapid development iteration, and provides precise ground truth.

This example uses the Unreal Engine simulation environment from Epic Games® to develop and evaluate a lidar localization algorithm from a known initial pose in a parking scenario.

Set Up Scenario in Simulation Environment

Parking a vehicle into a parking spot is a challenging maneuver that relies on accurate localization. Use the prebuilt Large Parking Lot scene to create such a scenario. The Select Waypoints for Unreal Engine Simulation example describes how to interactively select a sequence of waypoints from a scene and how to generate a reference vehicle trajectory. This example uses a recorded reference trajectory obtained using the approach described in the linked example. First, visualize the reference path on a 2-D bird's-eye view of the scene.

% Load reference path
data = load('ReferencePathForward.mat');

refPosesX = data.ReferencePathForward.refPosesX;
refPosesY = data.ReferencePathForward.refPosesY;
refPosesT = data.ReferencePathForward.refPosesT;

sceneName = 'LargeParkingLot';
hScene = figure;
helperShowSceneImage(sceneName);
hold on
scatter(refPosesX(:,2), refPosesY(:,2), [], 'filled', 'DisplayName', ...
    'Reference Path');
xlim([0 40])
ylim([-20 10])
legend
hold off

Record and Visualize Sensor Data

Set up a simple model with a hatchback vehicle moving along the specified reference path by using the Simulation 3D Vehicle with Ground Following block. Mount a lidar sensor at the roof center of the vehicle using the Simulation 3D Lidar block. Record and visualize the sensor data. The recorded data is used to develop a localization algorithm.

close(hScene)

if ~ispc
    error("Unreal Engine Simulation is supported only on Microsoft" ...
        + char(174) + " Windows" + char(174) + ".");
end

% Open model
modelName = 'recordAndVisualize';
open_system(modelName);
snapnow;

% Run simulation
simOut = sim(modelName);

The recorded sensor data is returned in the simOut variable.

Develop Algorithm Using Recorded Data

In this example, you develop an algorithm based on point cloud registration. Point cloud registration is a common localization technique that estimates the relative motion between two point clouds to derive localization data. Accumulating relative motion like this over long sequences can lead to drift, which can be corrected using loop closure detection and pose graph optimization, as shown in the Build a Map from Lidar Data Using SLAM example. Since this example uses a short reference path, loop closure detection is omitted.
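For instance, a single registration step and the pose composition it feeds can be sketched as follows (a minimal sketch; ptCloudPrev, ptCloudCurr, and absPose are illustrative variables, and the full loop later in this example follows the same pattern):

% Estimate the relative motion between two consecutive lidar frames.
relPose = pcregistericp(ptCloudCurr, ptCloudPrev, 'Metric', 'pointToPlane');

% Compose the relative transform onto the running absolute pose. Small
% registration errors accumulate here, which is what causes drift over
% long sequences.
absPose = rigid3d(relPose.T * absPose.T);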

Extract the lidar sensor data and ground truth location and orientation provided by the Simulation 3D Lidar block. The ground truth location and orientation are provided in the world (scene) coordinate system. Extract the known initial pose from the ground truth data by using the helperPoseToRigidTransform function.

close_system(modelName);

% Extract lidar sensor data
ptCloudArr = helperGetPointClouds(simOut);

% Extract ground truth
[lidarLocation, lidarOrientation, simTimes] = helperGetLocalizerGroundTruth(simOut);

% Compute initial pose
initPose  = [lidarLocation(1, :) lidarOrientation(1, :)];
initTform = helperPoseToRigidTransform(initPose);

Develop a lidar localization algorithm by using the extracted sensor data. Use a pcviewset object to process and store odometry data. A pcviewset object organizes odometry data into a set of views, and the associated connections between views, where:

  • Each view has an absolute pose describing the rigid transformation to some fixed reference frame.

  • Each connection has a relative pose describing the rigid transformation between the two connecting views.

The localization estimate is maintained in the form of the absolute poses for each view with respect to the scene reference frame.
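As a minimal sketch of that bookkeeping (variable names are illustrative; the full loop below follows the same pattern), each registered frame contributes one view carrying its absolute pose and one connection carrying the relative pose estimated by registration:

% Minimal pcviewset bookkeeping sketch. initTform, absPose, relPose and the
% point cloud variables are illustrative.
vSet = pcviewset;

% The first view anchors the scene reference frame at the known initial pose.
vSet = addView(vSet, 1, initTform, 'PointCloud', ptCloudFirst);

% Each subsequent frame adds a view (absolute pose) and a connection to the
% previous view (relative pose from registration).
vSet = addView(vSet, 2, absPose, 'PointCloud', ptCloudNext);
vSet = addConnection(vSet, 1, 2, relPose);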

Use a pcplayer object to display streaming point cloud data in the loop as it is registered. Transform the viewing angle to a top view. The orange cuboid and path show the localization position estimated by the algorithm. The green path shows the ground truth.

% Create a view set
vSet = pcviewset;

absPose = initTform;
relPose = rigid3d;
viewId  = 1;

% Define rigid transformation between the lidar sensor mounting position
% and the vehicle reference point.
lidarToVehicleTform = helperPoseToRigidTransform(single([0 0 -1.57 0 0 0]));

% Process the point cloud frame
ptCloud = helperProcessPointCloud(ptCloudArr(1));

% Initialize accumulated point cloud map
ptCloudAccum = pctransform(ptCloud, absPose);

% Add first view to the view set
vSet = addView(vSet, viewId, absPose, 'PointCloud', ptCloud);

% Configure display
xlimits = [  0 50];
ylimits = [-25 10];
zlimits = [-30 30];
player = pcplayer(xlimits, ylimits, zlimits);

estimatePathHandle = [];
truthPathHandle    = [];

% Specify vehicle dimensions
centerToFront = 1.104;
centerToRear  = 1.343;
frontOverhang = 0.828;
rearOverhang  = 0.589;
vehicleWidth  = 1.653;
vehicleHeight = 1.513;
vehicleLength = centerToFront + centerToRear + frontOverhang + rearOverhang;

hatchbackDims = vehicleDimensions(vehicleLength, vehicleWidth, vehicleHeight, ...
    'FrontOverhang', frontOverhang, 'RearOverhang', rearOverhang);

vehicleDims  = [hatchbackDims.Length, hatchbackDims.Width, hatchbackDims.Height];
vehicleColor = [0.85 0.325 0.098];

% Initialize parameters
skipFrames  = 5;        % Number of frames to skip to accumulate sufficient motion
prevViewId  = viewId;
prevPtCloud = ptCloud;

% Loop over lidar sensor frames and localize
for viewId = 6 : skipFrames : numel(ptCloudArr)
    % Process frame
    ptCloud = helperProcessPointCloud(ptCloudArr(viewId));

    % Register current frame to previous frame
    relPose = pcregistericp(ptCloud, prevPtCloud, 'MaxIterations', 40, ...
        'Metric', 'pointToPlane');

    % Since motion is restricted to a 2-D plane, discard motion along Z to
    % prevent accumulation of noise.
    relPose.Translation(3) = 0;

    % Update absolute pose
    height  = absPose.Translation(3);
    absPose = rigid3d( relPose.T * absPose.T );
    absPose.Translation(3) = height;

    % Add new view and connection to previous view
    vSet = addView(vSet, viewId, absPose, 'PointCloud', ptCloud);
    vSet = addConnection(vSet, prevViewId, viewId, relPose);

    % Accumulated point cloud map
    ptCloudAccum = pccat([ptCloudAccum, pctransform(ptCloud, absPose)]);

    % Compute ground truth and estimate position
    localizationEstimatePos = absPose.Translation;
    localizationTruthPos    = lidarLocation(viewId, :);

    % Update accumulated point cloud map
    view(player, ptCloudAccum);

    % Set viewing angle to top view
    view(player.Axes, 2);

    % Convert current absolute pose of sensor to vehicle frame
    absVehiclePose = rigid3d( lidarToVehicleTform.T * absPose.T );

    % Draw vehicle at current absolute pose
    helperDrawVehicle(player.Axes, absVehiclePose, vehicleDims, 'Color', vehicleColor);

    % Draw localization estimate and ground truth points
    helperDrawLocalization(player.Axes, ...
        localizationEstimatePos, estimatePathHandle, vehicleColor, ...
        localizationTruthPos, truthPathHandle, [0 1 0]);

    prevPtCloud = ptCloud;
    prevViewId  = viewId;
end

Zoom in to the tail of the trajectory to examine the localization estimate compared to the ground truth.

xlim(player.Axes, [0 15]);
ylim(player.Axes, [-15 0]);
zlim(player.Axes, [0 15]);

snapnow;

% Close player
hide(player);

A useful outcome of a localization algorithm based on point cloud registration is a map of the traversed environment. You can obtain this map by combining all the point clouds into a common reference frame. The pccat function is used in each iteration of the loop above, along with pctransform, to incrementally combine the registered point clouds. Alternatively, you can use the pcalign function to align all point clouds to the common reference frame in one shot at the end.
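As a sketch of the one-shot alternative (assuming the pcalign function is available and using the view set built in the loop above; the grid step value is illustrative), you can pull the stored point clouds and absolute poses out of the view set and align them in a single call:

% Gather per-view point clouds and absolute poses from the view set.
ptClouds = vSet.Views.PointCloud;
tforms   = vSet.Views.AbsolutePose;

% Align all point clouds into the common (scene) reference frame in one shot.
gridStep   = 0.2;    % illustrative merge resolution, in meters
ptCloudMap = pcalign(ptClouds, tforms, gridStep);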

Superimpose the point cloud map on the top-view image of the scene to visually examine how closely it resembles features in the scene.

hMapOnScene = helperSuperimposeMapOnSceneImage(sceneName, ptCloudAccum);

snapnow;

% Close the figure
close(hMapOnScene);

The localization algorithm described above is encapsulated in a helper class, which can be used as a framework to develop a localization pipeline using point cloud registration.

  • Use the point cloud processing name-value arguments of the helper class to configure how point clouds are processed prior to registration.

  • Use the registration name-value arguments of the helper class to configure how point clouds are registered.

Evaluate Localization Accuracy

To quantify the efficacy of localization, measure the deviation in translation and rotation estimates compared to ground truth. Since the vehicle is moving on flat ground, this example is concerned only with motion in the X-Y plane.

hFigMetrics = helperDisplayMetrics(vSet, lidarLocation, lidarOrientation, simTimes);

Simulate in the Loop

Although metrics like deviation in translation and rotation estimates are necessary, the performance of a localization system can also have downstream impacts. For example, changes to the accuracy or performance of a localization system can affect the vehicle controller, necessitating the retuning of controller gains. Therefore, it is crucial to have a closed-loop verification framework that incorporates downstream components. The localizeAndControlUsingLidar model demonstrates this framework by incorporating a localization algorithm, a vehicle controller, and a suitable vehicle model.

The model has these main components:

  • The Localize block is a MATLAB Function block that encapsulates the localization algorithm implemented using the helper localization class described above. This block takes the lidar point cloud generated by the Simulation 3D Lidar block and the initial known pose as inputs and produces a localization estimate. The estimate is returned as $(x, y, \theta)$, which represents the 2-D pose of the lidar in the map reference frame.

  • The Plan subsystem loads a preplanned trajectory from the workspace using the refPoses, directions, curvatures, and velocities workspace variables. The Path Smoother Spline block was used to compute the refPoses, directions, and curvatures variables. The Velocity Profiler block computed the velocities variable.

  • The Helper Path Analyzer block uses the reference trajectory and the current pose to feed the appropriate reference signal to the vehicle controller.

  • The Vehicle Controller subsystem controls the steering and velocity of the vehicle by using a lateral and longitudinal controller to produce a steering and acceleration or deceleration command. The Lateral Controller Stanley and Longitudinal Controller Stanley blocks are used to implement this. These commands are fed to a vehicle model to simulate the dynamics of the vehicle in the simulation environment using the Vehicle Body 3DOF block.

close(hFigMetrics);

% Load workspace variables for preplanned trajectory
refPoses   = data.ReferencePathForward.Trajectory.refPoses;
directions = data.ReferencePathForward.Trajectory.directions;
curvatures = data.ReferencePathForward.Trajectory.curvatures;
velocities = data.ReferencePathForward.Trajectory.velocities;

startPose = refPoses(1, :);

% Open model
modelName = 'localizeAndControlUsingLidar';
open_system(modelName);
snapnow;

% Run simulation
sim(modelName);

close_system(modelName);

With this setup, it is possible to rapidly iterate over different scenarios, sensor configurations, or reference trajectories and refine the localization algorithm before moving to real-world testing.

  • To select a different scenario, use the Simulation 3D Scene Configuration block. Choose from the existing prebuilt scenes or create a custom scene in the Unreal® Editor.

  • To create a different reference trajectory, use the interactive waypoint selection approach shown in the Select Waypoints for Unreal Engine Simulation example.

  • To alter the sensor configuration, use the Simulation 3D Lidar block. The Mounting tab provides options for specifying different sensor mounting placements. The Parameters tab provides options for modifying sensor parameters such as detection range, field of view, and resolution.

Supporting Functions

Extract an array of pointCloud objects that contain lidar sensor data.

function ptCloudArr = helperGetPointClouds(simOut)

% Extract signal
ptCloudData = simOut.ptCloudData.signals.values;

% Create a pointCloud array
ptCloudArr = pointCloud(ptCloudData(:,:,:,3));  % Ignore first 2 frames
for n = 4 : size(ptCloudData,4)
    ptCloudArr(end+1) = pointCloud(ptCloudData(:,:,:,n)); %#ok<AGROW>
end
end

Extract ground truth location and orientation.

function [lidarLocation, lidarOrientation, simTimes] = helperGetLocalizerGroundTruth(simOut)

lidarLocation    = squeeze(simOut.lidarLocation.signals.values)';
lidarOrientation = squeeze(simOut.lidarOrientation.signals.values)';
simTimes         = simOut.lidarLocation.time;

% Ignore first 2 frames
lidarLocation(1:2, :)    = [];
lidarOrientation(1:2, :) = [];
simTimes(1:2, :)         = [];
end

Draw localization estimate and ground truth on axes.

function [estHandle,truthHandle] = helperDrawLocalization(axesHandle, ...
    est, estHandle, estColor, truth, truthHandle, truthColor)

% Create scatter objects and adjust legend
if isempty(estHandle) || isempty(truthHandle)
    markerSize = 6;
    hold(axesHandle, 'on');
    estHandle   = scatter3(axesHandle, NaN, NaN, NaN, markerSize, estColor, 'filled');
    truthHandle = scatter3(axesHandle, NaN, NaN, NaN, markerSize, truthColor, 'filled');
    %legend(axesHandle, {'Points', 'Estimate', 'Truth'}, ...
    %    'Color', [1 1 1], 'Location', 'northeast');
    hold(axesHandle, 'off');
end

estHandle.XData(end+1) = est(1);
estHandle.YData(end+1) = est(2);
estHandle.ZData(end+1) = est(3);

truthHandle.XData(end+1) = truth(1);
truthHandle.YData(end+1) = truth(2);
truthHandle.ZData(end+1) = truth(3);
end

Superimpose point cloud map on scene image.

function hFig = helperSuperimposeMapOnSceneImage(sceneName, ptCloudAccum)

hFig = figure('Name', 'Point Cloud Map');
hIm  = helperShowSceneImage(sceneName);

hold(hIm.Parent, 'on')
pcshow(ptCloudAccum);
hold(hIm.Parent, 'off')

xlim(hIm.Parent, [-10 50]);
ylim(hIm.Parent, [-30 20]);
end

Display metrics to assess quality of localization.

function hFig = helperDisplayMetrics(vSet, lidarLocation, lidarOrientation, simTimes)

absPoses = vSet.Views.AbsolutePose;
translationEstimates = vertcat(absPoses.Translation);
rotationEstimates    = pagetranspose(cat(3, absPoses.Rotation));

viewIds   = vSet.Views.ViewId;
viewTimes = simTimes(viewIds);

xEst   = translationEstimates(:, 1);
yEst   = translationEstimates(:, 2);
yawEst = euler(quaternion(rotationEstimates, 'rotmat', 'point'), 'ZYX', 'point');
yawEst = yawEst(:, 1);

xTruth   = lidarLocation(viewIds, 1);
yTruth   = lidarLocation(viewIds, 2);
yawTruth = lidarOrientation(viewIds, 3);

xDeviation   = abs(xEst - xTruth);
yDeviation   = abs(yEst - yTruth);
yawDeviation = abs(helperWrapToPi(yawTruth - yawEst));

hFig = figure('Name', 'Metrics - Absolute Deviation');

subplot(3,1,1)
plot(viewTimes, xDeviation, 'LineWidth', 2);
ylim([0 1])
grid on
title('X')
xlabel('Time (s)')
ylabel('Deviation (m)')

subplot(3,1,2)
plot(viewTimes, yDeviation, 'LineWidth', 2);
ylim([0 1])
grid on
title('Y')
xlabel('Time (s)')
ylabel('Deviation (m)')

subplot(3,1,3)
plot(viewTimes, yawDeviation, 'LineWidth', 2);
ylim([0 pi/20])
grid on
title('Yaw')
xlabel('Time (s)')
ylabel('Deviation (rad)')
end

Wrap angles to be in the range $[-\pi, \pi]$.

function angle = helperWrapToPi(angle)

idx = (angle < -pi) | (angle > pi);
angle(idx) = helperWrapTo2Pi(angle(idx) + pi) - pi;
end

Wrap angles to be in the range $[0, 2\pi]$.

function angle = helperWrapTo2Pi(angle)

pos = (angle > 0);
angle = mod(angle, 2*pi);
angle(angle == 0 & pos) = 2*pi;
end


Source: https://www.mathworks.com/help/driving/ug/lidar-localization-with-unreal-engine-simulation.html

carla-simulator / carla (GitHub issue #266)

@dirtshell For the record, we do have a rolling shutter effect, but it is very slow. You have to run CARLA in a sort of slow-motion mode to be able to see it (see, e.g., our video https://youtu.be/S_XbKHC_hN0?t=18s). You do so by running CARLA at a fixed time-step with a very high FPS.

The higher the frame rate, the slower the simulation runs, and the smaller the angle the shutter moves each frame. The problem with the current implementation is that for each lidar update there is a render update too, which is not very optimal; that's why we want to update the lidar on each physics sub-step.

We do plan in the future to add another Lidar implementation as well, using the depth buffer at shader level. This should be faster.

We don't have a plan of action for these two issues yet; if you think you can contribute, that would be great :)

Source: https://github.com/carla-simulator/carla/issues/266

Creating 3D Virtual Driving Environments for Simulation-Aided Development of Autonomous Driving and Active Safety (SAE Technical Paper 2017-01-0107)

2017-03-28
Recreating traffic scenarios for testing autonomous driving in the real world requires significant time, resources, and expense, and can present a safety risk if hazardous scenarios are tested. Using a 3D virtual environment to enable testing of many of these traffic scenarios on the desktop or cluster significantly reduces the number of required road tests. In order to facilitate the development of perception and control algorithms for level 4 autonomy, a shared memory interface between MATLAB, Simulink, and Unreal Engine 4 can send information (such as vehicle control signals) back to the virtual environment. The shared memory interface conveys arbitrary numerical data, RGB image data, and point cloud data for the simulation of LiDAR sensors. The interface consists of a plugin for Unreal Engine, which contains the necessary read/write functions, and a beta toolbox for MATLAB, capable of reading and writing from the same shared memory locations specified in Unreal Engine, MATLAB, and Simulink. The LiDAR sensor model was tested by generating point clouds with beam patterns that mimic Velodyne HDL-32E (32 beam) sensors and is demonstrated to run at sufficient frame rates for real-time computations by leveraging the Graphics Processing Unit (GPU).
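As an illustration of the point cloud side of such an interface (a sketch only; the file name, data layout, and use of MATLAB's memmapfile over a file-backed mapping are assumptions, not the paper's actual plugin or toolbox API), a reader process can map a shared region and reinterpret it as an N-by-3 array of lidar returns:

% Sketch: map a file-backed shared memory region written by another process
% (for example, an Unreal Engine plugin) and read one lidar frame from it.
numPoints = 32 * 1024;                          % assumed points per frame
m = memmapfile('lidar_shared.bin', ...          % assumed mapping file name
    'Format', {'single', [numPoints 3], 'xyz'});

% Interpret the mapped block as an N-by-3 array of XYZ returns.
xyz = m.Data(1).xyz;
ptCloud = pointCloud(xyz);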


Source: https://www.sae.org

Lidar message structure seems wrong in Unreal Engine 4 simulation (with carter_sim)

Hello everyone

I used the carter_sim application in the Unreal Engine 4 simulation and I get errors: the structure of the lidar message seems to be incorrect.
The carter_sim application connects to Isaac Sim via TCP and instantiates the robot with multiple sensors in the simulator, in particular a lidar sensor.

This is what I did:
Instantiate the robot with its sensors with the following command:
bazel run //apps/carter/carter_sim:carter_sim -- --config="apps/assets/maps/carter_warehouse_p.config.json" --graph="apps/assets/maps/carter_warehouse_p.graph.json"

Then I launch the Isaac Sim Editor with the command:
./Engine/Binaries/Linux/UE4Editor IsaacSimProject CarterWarehouse_P -vulkan -isaac_sim_config_json="/home/jetskins/isaac/apps/carter/carter_sim/bridge_config/carter_full.json"

Everything seems fine, but when I play the simulation, I get:


However, port 5000 is opened by the Unreal simulation, and netcat receives lidar messages.

The structure of the lidar message appears to be incorrect.
When I delete the lidar in the configuration file, I can see the camera messages in Websight. It therefore seems that the lidar message structure is incorrect and that this is what terminates the program.
Thanks for your help!

Source: https://forums.developer.nvidia.com/t/lidar-message-structure-seems-wrong-in-unreal-engine-4-simulation-with-carter-sim/125445

Design Lidar SLAM Algorithm Using Unreal Engine Simulation Environment

Introduction

Automated Driving Toolbox™ integrates an Unreal Engine simulation environment in Simulink®. Simulink blocks related to this simulation environment can be found in the Simulation 3D block library. These blocks provide the ability to:

  • Select different scenes in the 3D simulation environment

  • Place and move vehicles in the scene

  • Attach and configure sensors on the vehicles

  • Simulate sensor data based on the environment around the vehicle

This powerful simulation tool can be used to supplement real data when developing, testing, and verifying the performance of automated driving algorithms, making it possible to test scenarios that are difficult to reproduce in the real world.

In this example, you evaluate a lidar perception algorithm using synthetic lidar data generated from the simulation environment. The example walks you through the following steps:

  • Record and visualize synthetic lidar sensor data from the simulation environment.

  • Develop a perception algorithm to build a map using SLAM in MATLAB®.

Set Up Scenario in Simulation Environment

First, set up a scenario in the simulation environment that can be used to test the perception algorithm. Use a scene depicting a typical city block with a single vehicle that is the vehicle under test. You can use this scene to test the performance of the algorithm in an urban road setting.

Next, select a trajectory for the vehicle to follow in the scene. The Select Waypoints for Unreal Engine Simulation example describes how to interactively select a sequence of waypoints from a scene and generate a vehicle trajectory. This example uses a recorded drive segment obtained using the approach described in that example.

% Load reference path for recorded drive segment
xData   = load('refPosesX.mat');
yData   = load('refPosesY.mat');
yawData = load('refPosesT.mat');

% Set up workspace variables used by model
refPosesX = xData.refPosesX;
refPosesY = yData.refPosesY;
refPosesT = yawData.refPosesT;

% Display path on scene image
sceneName = 'USCityBlock';
hScene = figure;
helperShowSceneImage(sceneName);
hold on
scatter(refPosesX(:,2), refPosesY(:,2), 7, 'filled')

% Adjust axes limits
xlim([-150 100])
ylim([-125 75])

The Simulink model is configured with the US City Block scene using the Simulation 3D Scene Configuration block. The model places a vehicle on the scene using the Simulation 3D Vehicle with Ground Following block. A lidar sensor is attached to the vehicle using the Simulation 3D Lidar block. In the block dialog box, use the Mounting tab to adjust the placement of the sensor. Use the Parameters tab to configure properties of the sensor to simulate different lidar sensors. In this example, the lidar is mounted on the center of the roof. The lidar sensor is configured to model a typical Velodyne® HDL-32E sensor.

close(hScene)

if ~ispc
    error(['3D Simulation is only supported on Microsoft', char(174), ...
        ' Windows', char(174), '.']);
end

% Open the model
modelName = 'LidarSLAMIn3DSimulation';
open_system(modelName);
snapnow;

The model records and visualizes the synthetic lidar data. The recorded data is available through the simulation output and can be used for prototyping your algorithm in MATLAB. Additionally, the model uses a From Workspace (Simulink) block to load simulated measurements from an Inertial Navigation Sensor (INS). The INS data was obtained by using an INS sensor simulation object and saved in a MAT file.

The rest of the example follows these steps:

  1. Simulate the model to record synthetic lidar data generated by the sensor and save it to the workspace.

  2. Use the sensor data saved to the workspace to develop a perception algorithm in MATLAB. The perception algorithm builds a map of the surroundings using SLAM.

  3. Visualize the results of the built map.

Record and Visualize Synthetic Lidar Sensor Data

The Record and Visualize subsystem records the synthetic lidar data to the workspace using a To Workspace (Simulink) block. The Visualize Point Cloud MATLAB Function block uses a pcplayer object to visualize the streaming point clouds. The Visualize INS Path MATLAB Function block visualizes the streaming INS data.

Simulate the model. The streaming point cloud display shows the synthetic lidar sensor data. The scene display shows the synthetic INS sensor data. Once the model has completed simulation, the simOut variable holds a structure with variables written to the workspace. The helperGetPointCloud function extracts the sensor data into an array of pointCloud objects. The pointCloud object is the fundamental data structure used to hold lidar data and perform point cloud processing in MATLAB. Additionally, INS data is loaded from a MAT file, which will later be used to develop the perception algorithm. The INS data was obtained using an INS sensor simulation object and has been processed to contain [x, y, theta] poses in world coordinates.

% Update simulation stop time to end when reference path is completed
simStopTime = refPosesX(end,1);
set_param(gcs, 'StopTime', num2str(simStopTime));

% Load INS data from MAT file
data = load('insMeasurement.mat');
insData = data.insMeasurement.signals.values;

% Run the simulation
simOut = sim(modelName);

% Create a pointCloud array from the recorded data
ptCloudArr = helperGetPointCloud(simOut);

Use Recorded Data to Develop Perception Algorithm

The synthetic lidar sensor data can be used to develop, experiment with, and verify a perception algorithm in different scenarios. This example uses an algorithm to build a 3D map of the environment from streaming lidar data. Such an algorithm is a building block for applications like localization. It can also be used to create high-definition (HD) maps for geographic regions that can then be used for online localization. The map building algorithm is encapsulated in the helperLidarMapBuilder class. This class uses point cloud and lidar processing capabilities in MATLAB. For more details, see Point Cloud Processing.

The helperLidarMapBuilder class takes incoming point clouds from a lidar sensor and progressively builds a map using the following steps:

  1. Preprocess point cloud: Preprocess each incoming point cloud to remove the ground plane and ego vehicle.

  2. Register point clouds: Register the incoming point cloud with the last point cloud using a normal distributions transform (NDT) registration algorithm. The pcregisterndt function performs the registration. To improve the accuracy and efficiency of registration, pcdownsample is used to downsample the point cloud prior to registration. An initial transform estimate can substantially improve registration performance. In this example, INS measurements are used to accomplish this.

  3. Transform point cloud: Use the estimated transformation obtained from registration to transform the incoming point cloud to the frame of reference of the map.

  4. Update view set: Add the incoming point cloud and the estimated absolute pose as a view in a pcviewset object. Add a connection between the current and previous view with the relative transformation between them.

The updateMap method of the helperLidarMapBuilder class accomplishes these steps. The helperEstimateRelativeTransformationFromINS function computes an initial estimate for registration from the simulated INS sensor readings.
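A condensed sketch of one such iteration is shown below (assuming the pcregisterndt and pcdownsample functions; the variable names and parameter values are illustrative, and the actual processing lives inside the helper class):

% Downsample to improve registration speed and stability.
movingDown = pcdownsample(ptCloudCurr, 'random', 0.25);
fixedDown  = pcdownsample(ptCloudPrev, 'random', 0.25);

% Register against the previous frame, seeded with the INS-based estimate.
gridStep = 3.5;   % NDT voxel size in meters, illustrative
relPose  = pcregisterndt(movingDown, fixedDown, gridStep, ...
    'InitialTransform', initTform);

% Transform the incoming cloud into the map frame and update the view set.
absPose         = rigid3d(relPose.T * absPose.T);
ptCloudMapFrame = pctransform(ptCloudCurr, absPose);

vSet = addView(vSet, viewId, absPose, 'PointCloud', ptCloudCurr);
vSet = addConnection(vSet, viewId - 1, viewId, relPose);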

Such an algorithm is susceptible to drift while accumulating a map over long sequences. To reduce the drift, it is typical to detect loop closures and use graph SLAM to correct the drift. See the Build a Map from Lidar Data Using SLAM example for a detailed treatment. The configureLoopDetector method of the helperLidarMapBuilder class configures loop closure detection. Once it is configured, loop closure detection takes place each time updateMap is invoked, using the following functions and classes:

  • pcviewset: Manages data associated with point cloud odometry, such as point clouds, poses, and connections.

  • A scan context descriptor extractor: Extracts scan context descriptors from each incoming point cloud. Scan context is a 2-D global feature descriptor that is used for loop closure detection.

  • A scan context loop detector: Manages scan context descriptors and detects loop closures. It uses a scan context distance metric to compute the distance between descriptors and select the closest feature matches.

Then, the example uses point cloud registration to accept or reject loop closure candidates and to find the loop closure transformation.
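The sketch below illustrates this candidate-then-verify pattern (assuming the scanContextDescriptor and scanContextDistance functions are available; the variable names are illustrative and the thresholds mirror the configuration values used below):

% Compare the current scan against a previously stored scan.
descCurr    = scanContextDescriptor(ptCloudCurr);
descEarlier = scanContextDescriptor(ptCloudEarlier);

distanceThreshold = 0.15;   % descriptor distance gate, illustrative
if scanContextDistance(descCurr, descEarlier) < distanceThreshold
    % Candidate loop closure: verify with registration and keep it only if
    % the registration error is small enough.
    [loopTform, ~, rmse] = pcregisterndt(ptCloudCurr, ptCloudEarlier, 3.5);
    if rmse < 2   % loop confirmation RMSE gate, illustrative
        vSet = addConnection(vSet, earlierViewId, currViewId, loopTform);
    end
end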

% Set the random seed for example reproducibility
rng(0);

% Create a lidar map builder
mapBuilder = helperLidarMapBuilder('DownsamplePercent', 0.25, ...
    'RegistrationGridStep', 3.5, 'Verbose', true);

% Configure the map builder to detect loop closures
configureLoopDetector(mapBuilder, ...
    'LoopConfirmationRMSE', 2, ...
    'SearchRadius', 0.15, ...
    'DistanceThreshold', 0.15);

% Loop through the point cloud array and progressively build a map
skipFrames = 5;
numFrames  = numel(ptCloudArr);
exitLoop   = false;

prevInsMeas = insData(1, :);
for n = 1 : skipFrames : numFrames
    insMeas = insData(n, :);

    % Estimate initial transformation using INS
    initTform = helperEstimateRelativeTransformationFromINS(insMeas, prevInsMeas);

    % Update map with new lidar frame
    updateMap(mapBuilder, ptCloudArr(n), initTform);

    % Update top-view display
    isDisplayOpen = updateDisplay(mapBuilder, exitLoop);

    % Check and exit if needed
    exitLoop = ~isDisplayOpen;

    prevInsMeas = insMeas;
end
snapnow;

% Close display
closeDisplay = true;
updateDisplay(mapBuilder, closeDisplay);
Loop closure candidate found between view Id 211 and 4 with RMSE 1.019180... Accepted

The accumulated drift progressively increases over time, resulting in an unusable map.

Once sufficient loop closures are detected, the accumulated drift can be corrected using pose graph optimization. This is accomplished by the optimizeMapPoses method of the helperLidarMapBuilder class, which uses createPoseGraph to create a pose graph and optimizePoseGraph (Navigation Toolbox) to optimize the pose graph.

After the pose graph has been optimized, rebuild the map using the updated poses. This is accomplished by the rebuildMap method of helperLidarMapBuilder, which uses pcalign.

Use optimizeMapPoses and rebuildMap to correct for the drift and rebuild the map. Visualize the view set before and after pose graph optimization.

% Visualize viewset before pose graph optimization
hFigViewset = figure;
hG = plot(mapBuilder.ViewSet);
view(hG.Parent, 2);
title('Viewset Display')

% Optimize pose graph and rebuild map
optimizeMapPoses(mapBuilder);
rebuildMap(mapBuilder);

% Overlay viewset after pose graph optimization
hold(hG.Parent, 'on');
plot(mapBuilder.ViewSet);
hold(hG.Parent, 'off');

legend(hG.Parent, 'before', 'after')
Optimizing pose graph...done Rebuilding map...done

Visualize the accumulated point cloud map computed using the recorded data.

close(hFigViewset)

hFigMap = figure;
pcshow(mapBuilder.Map)

% Customize axes labels and title
xlabel('X (m)')
ylabel('Y (m)')
zlabel('Z (m)')
title('Point Cloud Map')

helperMakeFigurePublishFriendly(hFigMap);

By changing the scene, placing more vehicles in the scene, or updating the sensor mounting and parameters, the perception algorithm can be stress-tested under different scenarios. This approach can be used to increase coverage for scenarios that are difficult to reproduce in the real world.

% Close windows
close(hFigMap)
close_system(modelName)

Supporting Functions

helperGetPointCloud Extract an array of pointCloud objects.

function ptCloudArr = helperGetPointCloud(simOut)

% Extract signal
ptCloudData = simOut.ptCloudData.signals.values;

% Create a pointCloud array
ptCloudArr = pointCloud(ptCloudData(:,:,:,1));
for n = 2 : size(ptCloudData,4)
    ptCloudArr(end+1) = pointCloud(ptCloudData(:,:,:,n)); %#ok<AGROW>
end
end

helperMakeFigurePublishFriendly Adjust figure so that screenshot captured by publish is correct.

function helperMakeFigurePublishFriendly(hFig)

if ~isempty(hFig) && isvalid(hFig)
    hFig.HandleVisibility = 'callback';
end
end

Additional supporting functions or classes used in the example are included below.

helperLidarMapBuilder progressively builds a lidar map using point cloud scans. Each point cloud is processed to remove the ground plane and the ego vehicle, and registered against the previous point cloud. A point cloud map is then progressively built by aligning and merging the point clouds.

helperEstimateRelativeTransformationFromINS estimates a relative transformation from INS data.

helperShowSceneImage displays top-view image of the Unreal scene.

helperUpdatePolyline updates a polyline position used in conjunction with helperShowSceneImage.


Source: https://www.mathworks.com/help/driving/ug/design-lidar-slam-algorithm-using-3d-simulation-environment.html