
Run Standalone MATLAB MapReduce Application

Supported Platform: Linux® only.

This example shows you how to create a standalone MATLAB® MapReduce application using the mcc command and run it against a Hadoop® cluster.

Goal: Calculate the maximum arrival delay of an airline from the given dataset.

Dataset: airlinesmall.csv

Description: Airline departure and arrival information from 1987-2008.

Location: /usr/local/MATLAB/R2023b/toolbox/matlab/demos

Prerequisites

  1. Start this example by creating a new work folder that is visible to the MATLAB search path. A minimal sketch of this setup appears after this list.

  2. Before starting MATLAB, at a terminal, set the environment variable HADOOP_PREFIX to point to the Hadoop installation folder. For example:

    Shell        Command

    csh / tcsh   % setenv HADOOP_PREFIX /usr/lib/hadoop

    bash         $ export HADOOP_PREFIX=/usr/lib/hadoop

    Note

    This example uses /usr/lib/hadoop as the directory where Hadoop is installed. Your Hadoop installation directory may be different.

    If you forget to set the HADOOP_PREFIX environment variable before starting MATLAB, set it with the MATLAB setenv function at the MATLAB command prompt as soon as you start MATLAB. For example:

    setenv('HADOOP_PREFIX','/usr/lib/hadoop')

  3. Install the MATLAB Runtime in a folder that is accessible by every worker node in the Hadoop cluster. This example uses /usr/local/MATLAB/MATLAB_Runtime/R2023b as the location of the MATLAB Runtime folder.

    If you do not have the MATLAB Runtime, you can download it from the MathWorks website at /products/compiler/mcr.

    Note

    For information about which MATLAB Runtime version corresponds to which MATLAB release, see this list.

  4. Copy the map function maxArrivalDelayMapper.m from the /usr/local/MATLAB/R2023b/toolbox/matlab/demos folder to the work folder.

     maxArrivalDelayMapper.m

    For more information, see Write a Map Function. A sketch of the general shape of such a map function appears after this list.

  5. Copy the reduce function maxArrivalDelayReducer.m from the matlabroot/toolbox/matlab/demos folder to the work folder.

     maxArrivalDelayReducer.m

    For more information, see Write a Reduce Function. A sketch of the general shape of such a reduce function appears after this list.

  6. Create the directory /user/<username>/datasets on HDFS™ and copy the file airlinesmall.csv to that directory. Here <username> refers to your user name in HDFS.

    $ ./hadoop fs -mkdir hdfs://host:54310/user/<username>/datasets
    $ ./hadoop fs -copyFromLocal airlinesmall.csv hdfs://host:54310/user/<username>/datasets
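
For prerequisite 1, a minimal MATLAB sketch of creating such a work folder and putting it on the search path might look like the following. The folder name and location here are only assumptions, not part of the original example.

    % Sketch only: create a work folder for this example and put it on the
    % MATLAB search path. Use any location you like.
    workFolder = fullfile(getenv('HOME'), 'mapreduce_example');
    if ~exist(workFolder, 'dir')
        mkdir(workFolder);
    end
    cd(workFolder);
    addpath(workFolder);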
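
For prerequisite 4, a map function used with mapreduce follows the pattern below. This is only a sketch of that pattern; use the shipped maxArrivalDelayMapper.m file, which may differ in its details.

    function maxArrivalDelayMapper(data, info, intermKVStore)
    % Sketch of the mapper pattern. data holds one chunk of the datastore
    % (a table with an ArrDelay variable); info describes the chunk.
    % Compute the chunk's maximum arrival delay and add it to the
    % intermediate key-value store under a single key.
    partMax = max(data.ArrDelay);
    add(intermKVStore, 'PartialMaxArrivalDelay', partMax);
    end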
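
For prerequisite 5, the matching reduce function receives every value added under one intermediate key and writes the final key-value pair. Again, this is only a sketch; use the shipped maxArrivalDelayReducer.m file.

    function maxArrivalDelayReducer(intermKey, intermValIter, outKVStore)
    % Sketch of the reducer pattern. intermValIter iterates over all partial
    % maxima produced by the mappers for intermKey; keep the overall maximum
    % and add it to the output key-value store.
    maxVal = -Inf;
    while hasnext(intermValIter)
        maxVal = max(maxVal, getnext(intermValIter));
    end
    add(outKVStore, 'MaxArrivalDelay', maxVal);
    end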

Procedure

  1. Start MATLAB and verify that the HADOOP_PREFIX environment variable has been set. At the command prompt, type:

    >> getenv('HADOOP_PREFIX')

    If ans is empty, review the Prerequisites section above to see how you can set the HADOOP_PREFIX environment variable.

  2. Create a new MATLAB script named depMapRedStandAlone.m. You will add the code from the steps below to this script file.

  3. Create a datastore that points to the airline data in Hadoop Distributed File System (HDFS).

    ds = datastore('hdfs:///user/<username>/datasets/airlinesmall.csv', ...
        'TreatAsMissing','NA', ...
        'SelectedVariableNames',{'UniqueCarrier','ArrDelay'});

    For more information, see Work with Remote Data.

  4. Configure the application for deployment against Hadoop with default settings.

    config = matlab.mapreduce.DeployHadoopMapReducer;

    Use the matlab.mapreduce.DeployHadoopMapReducer class to configure the standalone application for the Hadoop environment where it will be deployed.

    For example, if you want to specify the location of the MATLAB Runtime on each of the worker nodes on the cluster, include a line of code similar to this:

    config = matlab.mapreduce.DeployHadoopMapReducer('MCRRoot','/opt/MATLAB/MATLAB_Runtime/R2023b');

    In this scenario, we assume that the MATLAB Runtime is installed in a non-default location such as /opt/MATLAB/MATLAB_Runtime on the worker nodes.

    For information on specifying additional cluster-specific properties, see matlab.mapreduce.DeployHadoopMapReducer.

    Note

    Specifying a MATLAB Runtime location as part of the class matlab.mapreduce.DeployHadoopMapReducer will override any MATLAB Runtime location specified during the execution of the standalone application.

  5. Define the execution environment using the mapreducer function.

    mr = mapreducer(config);

  6. Apply the mapreduce function.

    result = mapreduce(...
        ds,...
        @maxArrivalDelayMapper,@maxArrivalDelayReducer,...
        mr,...
        'OutputType','Binary', ...
        'OutputFolder','hdfs:///user/<username>/results/myresults');

    Note

    An HDFS directory such as .../myresults can be written to only once. If you plan on running your standalone application multiple times against the Hadoop cluster, make sure you delete the .../myresults directory on HDFS prior to each execution. Another option is to change the name of the .../myresults directory in the MATLAB code and recompile the application.

  7. Read the result from the resulting datastore.

    myAppResult = readall(result)

  8. Use the mcc command with the -m flag to create a standalone application.

    mcc -m depMapRedStandAlone.m

    The -m flag creates a standard executable that can be run from a command line. However, the mcc command cannot package the results in an installer.

  9. Run the standalone application from a Linux shell using the following command:

    $ ./run_depMapRedStandAlone.sh /usr/local/MATLAB/MATLAB_Runtime/R2023b

    /usr/local/MATLAB/MATLAB_Runtime/R2023b is an argument indicating the location of the MATLAB Runtime.

    Before executing the above command, verify that the HADOOP_PREFIX environment variable is set in the terminal by typing:

    $ echo $HADOOP_PREFIX

    If echo returns nothing, see the Prerequisites section above for how to set the HADOOP_PREFIX environment variable.

    Your application will fail to execute if the HADOOP_PREFIX environment variable is not set.

  10. You will see the following output:

    myAppResult = 
    
               Key           Value 
        _________________    ______
    
        'MaxArrivalDelay'    [1014]

To learn more about using the map and reduce functions, see Getting Started with MapReduce.

Complete code for the standalone application depMapRedStandAlone can be found here:

 depMapRedStandAlone.m
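
If the linked file is not at hand, a sketch of the complete script assembled from steps 3-7 above looks like this. The <username> placeholder stands for your HDFS user name, as in the steps above, and the shipped example file may differ in its details.

    % depMapRedStandAlone.m -- sketch assembled from steps 3-7 above.

    % Datastore pointing at the airline data on HDFS.
    ds = datastore('hdfs:///user/<username>/datasets/airlinesmall.csv', ...
        'TreatAsMissing','NA', ...
        'SelectedVariableNames',{'UniqueCarrier','ArrDelay'});

    % Configure the application for deployment against Hadoop with default settings.
    config = matlab.mapreduce.DeployHadoopMapReducer;

    % Define the execution environment.
    mr = mapreducer(config);

    % Run mapreduce on the cluster, writing the results to HDFS.
    result = mapreduce( ...
        ds, ...
        @maxArrivalDelayMapper, @maxArrivalDelayReducer, ...
        mr, ...
        'OutputType','Binary', ...
        'OutputFolder','hdfs:///user/<username>/results/myresults');

    % Read the result back from the resulting datastore and display it.
    myAppResult = readall(result)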
