Lab 8: SQL*Loader and External Tables

Objectives

After completing this lab you should be able to:

  1. Bulk load data into an Oracle table using the SQL*Loader utility.
  2. Create and query an external table using the ORACLE_LOADER access driver.

Lab Assignment Steps

  1. Create a table of stock prices with columns and data types as shown below. Make the symbol and close_date columns the primary key.
  2. Verify the table's structure with the SQL*Plus DESCRIBE command; the output should match:
     SQL> desc stock_price
     Name                            Null?    Type
     ------------------------------- -------- ----
     SYMBOL                          NOT NULL VARCHAR2(5)
     CLOSE_DATE                      NOT NULL DATE
     SEQ_NO                                   NUMBER
     OPEN                                     NUMBER(8,4)
     HI                                       NUMBER(8,4)
     LO                                       NUMBER(8,4)
     CLOSE                                    NUMBER(8,4)
     VOLUME                                   NUMBER(12)
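
     The DESCRIBE output above implies a definition along these lines (a sketch; the constraint name stock_price_pk is a placeholder of my choosing, and the composite primary key covers symbol and close_date as required in step 1):

     CREATE TABLE stock_price (
       symbol     VARCHAR2(5)  NOT NULL,
       close_date DATE         NOT NULL,
       seq_no     NUMBER,
       open       NUMBER(8,4),
       hi         NUMBER(8,4),
       lo         NUMBER(8,4),
       close      NUMBER(8,4),
       volume     NUMBER(12),
       CONSTRAINT stock_price_pk PRIMARY KEY (symbol, close_date)
     );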
    
  3. Next, you will need to bulk load data into this table. You do this with an Oracle database utility called SQL*Loader. Your control file will look like this:
    options (direct=false, rows=50000)
    load data
    infile 'stocks.csv'
    append
    discardmax 10
    into table stock_price
    fields terminated by ','
    trailing nullcols
    (symbol, close_date date "YYYYMMDD", open, hi, lo, close, volume)
    
    You will create this control file with a text editor. It is not a SQL script, so do not create it in SQL*Plus and do not try to run it there.


  4. Then to load the data, you use this command from the UNIX command line:
    sqlldr userid=user/password control=stocks.ctl
    You can simply type sqlldr at the command line, with no arguments, to display a help summary for this utility.

    Control files conventionally end with a .ctl extension. SQL*Loader records the results of your load in a log file that has the same base name as the control file and a .log extension. Records that fail to load are written to a bad file, again with the same base name and a .bad extension.
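
    With the control file name used above, the derived file names work out as follows (a shell sketch of the naming convention only; sqlldr itself is not invoked here):

    ```shell
    # Assumed control file name from the lab; we only derive the names
    # sqlldr would use for its log and bad files.
    ctl=stocks.ctl
    base=${ctl%.ctl}      # strip the .ctl extension -> stocks
    log="${base}.log"     # results of the load
    bad="${base}.bad"     # rejected records
    echo "$log $bad"
    ```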

    The data file you need can be downloaded here --> stocks.csv. You can either reference the data file in the infile clause of your .ctl file, or copy it from my directory (using the UNIX cp command): /home/droberts/public_html/n311/stocks.csv.


  5. Now you will create another table called STOCK_PRICE_EXTERNAL with these requirements:


  6. You will also need to do the following to make the external table work when you select from it:

    1. Create your own directory to hold the log and bad files (even selecting from an external table generates a log file, which requires write permission). Create the directory under your own phoenix account and give it full OS permissions (chmod 777 yourdirectory).
    2. In your access parameters clause of the external table, you will need to specify where the bad and log files go. They will be located in your directory.
    3. Here is an example of how mine worked:
    create table droberts.stock_price_external (
    symbol varchar2(5),
    ...)
    organization external
    (type ORACLE_LOADER
    default directory N311_DIR
    access parameters (records delimited by newline
    badfile DALE_DIR:'extload.bad' 
    logfile DALE_DIR:'extload.log' 
    fields terminated by ','
    (symbol char(5),...
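
    Creating the Oracle directory object that the example's DALE_DIR refers to would look roughly like this (a sketch: the directory name and OS path are placeholders for your own, and CREATE DIRECTORY requires the appropriate privilege, so you may need the DBA's help):

    -- Hypothetical names: substitute your own directory object name and
    -- the path to the directory you created in your phoenix account.
    CREATE OR REPLACE DIRECTORY dale_dir AS '/home/youruser/extload';

    -- If the table is owned by another schema, grant that owner access:
    GRANT READ, WRITE ON DIRECTORY dale_dir TO youruser;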
    
  7. Turn in the following: I will be checking that both of your tables have the same data in them, so be sure that your tables are created and loaded properly.
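
    One way to convince yourself the two tables hold the same rows before turning the lab in (a sketch; MINUS returns rows present in the first query but not the second, so both queries should come back empty):

    -- Rows in the regular table that are missing from the external table:
    SELECT * FROM stock_price
    MINUS
    SELECT * FROM stock_price_external;

    -- And the other direction:
    SELECT * FROM stock_price_external
    MINUS
    SELECT * FROM stock_price;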