Uploading CSV files into dynamic ADF Faces tables – programmatically creating columns

Lucas Jellema

In a recent article – ADF Faces File Uploading – It is really that simple! – I described how to set up file uploading with ADF Faces. Using the inputFile component, it is dead easy to develop a JSF page in which the user can upload a file and that then displays some file properties.

As a next step, I will look into the processing of CSV (comma separated values) files. In this article I will show how it is almost as easy to turn a CSV file into an ADF Faces table: columns are based on the comma separated values, table rows correspond to the records in the file, the first row can be used for column headers and each column is sortable:

When the checkbox Use first row values as Headers is checked:

When we click on the Content column, the records are sorted.

 

The JSF page itself is very similar to the page we discussed in the previous article. The main differences are the checkbox and the table component. However, you will rarely have seen a less interesting af:table: all of the actual table content is programmatically set, so you see no columns at all in the jspx file:

    <af:form usesUpload="true">
      <af:panelPage title="Upload and Process CSV File">
        <af:objectSpacer width="10" height="25"/>
        <af:inputFile label="File to Upload" columns="90"
                      value="#{FileProcessor.uploadedFile}"/>
        <af:objectSpacer width="10" height="15"/>
        <af:commandButton text="Start Upload"/>
        <af:objectSpacer width="10" height="25"/>
        <af:objectSeparator/>
        <af:panelBox>
          <af:panelForm>
            <af:inputText label="File Name"
                          value="#{FileProcessor.filename}"
                          readOnly="true"/>
            <af:inputText label="File Size"
                          value="#{FileProcessor.filesize}"
                          readOnly="true"/>
            <af:inputText label="File Type"
                          value="#{FileProcessor.filetype}"
                          readOnly="true"/>
            <af:selectBooleanCheckbox id="useFirstAsHeader"
                                      label="Use first row values as Headers"
                                      autoSubmit="true"
                                      value="#{CsvProcessor.useFirstRowAsHeaders}"/>
          </af:panelForm>
        </af:panelBox>
        <af:objectSpacer width="10" height="25"/>
        <af:table value="#{CsvProcessor.rows}" var="row"
                  varStatus="rowStatus" binding="#{CsvProcessor.table}"
                  partialTriggers="useFirstAsHeader">
          <f:facet name="header">
            <af:outputText value="Contents Extracted from CSV file"/>
          </f:facet>
        </af:table>
      </af:panelPage>
    </af:form>
 

The FileProcessor bean that is referenced from the inputFile component is configured in the faces-config.xml file:

    <managed-bean>
      <managed-bean-name>FileProcessor</managed-bean-name>
      <managed-bean-class>nl.amis.adffaces.files.FileProcessor</managed-bean-class>
      <managed-bean-scope>request</managed-bean-scope>
      <managed-property>
        <property-name>tablecreator</property-name>
        <property-class>nl.amis.adffaces.files.CSVtoADFTableProcessor</property-class>
        <value>#{CsvProcessor}</value>
      </managed-property>
    </managed-bean>
    <managed-bean>
      <managed-bean-name>CsvProcessor</managed-bean-name>
      <managed-bean-class>nl.amis.adffaces.files.CSVtoADFTableProcessor</managed-bean-class>
      <managed-bean-scope>session</managed-bean-scope>
    </managed-bean>
 

It has a managed property tablecreator that gets injected. This property refers to an instance of the class CSVtoADFTableProcessor, which takes the file contents and turns them into columns and rows for display in an ADF Faces table component. Let’s first look at the FileProcessor:

package nl.amis.adffaces.files;

import java.io.IOException;

import oracle.adf.view.faces.model.UploadedFile;

public class FileProcessor {

    private CSVtoADFTableProcessor tablecreator;
    private UploadedFile uploadedFile;
    private String filename;
    private long filesize;
    private String filecontents;
    private String filetype;

    public FileProcessor() {
    }

    public void setUploadedFile(UploadedFile uploadedFile) {
        this.uploadedFile = uploadedFile;
        this.filename = uploadedFile.getFilename();
        this.filesize = uploadedFile.getLength();
        this.filetype = uploadedFile.getContentType();
        try {
            tablecreator.processCSV(uploadedFile.getInputStream());
        } catch (IOException e) {
            // TODO
        }
    }

    public UploadedFile getUploadedFile() {
        return uploadedFile;
    }

    public void setFilename(String filename) {
        this.filename = filename;
    }

    public String getFilename() {
        return filename;
    }

    public void setFilesize(long filesize) {
        this.filesize = filesize;
    }

    public long getFilesize() {
        return filesize;
    }

    public void setFilecontents(String filecontents) {
        this.filecontents = filecontents;
    }

    public String getFilecontents() {
        return filecontents;
    }

    public void setFiletype(String filetype) {
        this.filetype = filetype;
    }

    public String getFiletype() {
        return filetype;
    }

    public void setTablecreator(CSVtoADFTableProcessor tablecreator) {
        this.tablecreator = tablecreator;
    }

    public CSVtoADFTableProcessor getTablecreator() {
        return tablecreator;
    }
}
 

Most of it consists of trivial bean properties with getters and setters, including the reference to the CSVtoADFTableProcessor instance. The setUploadedFile() method is invoked whenever the user uploads a file. This method then sets the other FileProcessor bean properties and gives the tablecreator an opportunity to step in, process the file contents and manipulate the table based on those contents.

The next class to discuss is of course the CSVtoADFTableProcessor. The binding attribute of the af:table component refers to this class instance. One of the things it does is take the uploaded file (InputStream) and parse it into rows and columns based on the CSV format (newlines to demarcate records, commas to delimit fields in records). For this generic parsing operation, I make use of one of the Ostermiller Java Utilities – the CSV Parser. To use this utility, I have downloaded the JAR (750Kb), copied it to the WEB-INF/lib directory of my project and set up the JAR as a project library in JDeveloper.
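The essence of what the CSV parser does can be illustrated with a naive stand-in written against only the JDK. Note that this sketch is merely illustrative: unlike the Ostermiller CSVParser, it ignores quoting and commas embedded in field values, which is exactly why the real utility is worth the 750Kb.

```java
import java.util.Arrays;

// Naive illustration of the row/field split a CSV parser performs:
// newlines demarcate records, commas delimit fields within a record.
// (No quoting support -- this is a sketch, not a replacement for CSVParser.)
public class NaiveCsvSplit {
    public static String[][] parse(String csv) {
        String[] records = csv.split("\r?\n");
        String[][] result = new String[records.length][];
        for (int i = 0; i < records.length; i++) {
            // -1 keeps trailing empty fields, so every comma counts
            result[i] = records[i].split(",", -1);
        }
        return result;
    }

    public static void main(String[] args) {
        String[][] values = parse("Name,City\nJohn,Amsterdam\nMary,Utrecht");
        System.out.println(Arrays.deepToString(values));
        // prints [[Name, City], [John, Amsterdam], [Mary, Utrecht]]
    }
}
```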

I can now leverage the CSV parsing capabilities of this utility with very little effort:

    public void processCSV(InputStream csvFile) {
        // Parse the data, using http://ostermiller.org/utils/download.html
        String[][] csvvalues = null;
        try {
            csvvalues = CSVParser.parse(new InputStreamReader(csvFile));
        } catch (IOException e) {
            // TODO
        }
        ....
 

The class as a whole is as follows:

package nl.amis.adffaces.files;

import com.Ostermiller.util.CSVParser;

import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;

import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import javax.faces.application.Application;
import javax.faces.context.FacesContext;

import oracle.adf.view.faces.component.core.data.CoreColumn;
import oracle.adf.view.faces.component.core.data.CoreTable;
import oracle.adf.view.faces.component.core.output.CoreOutputText;

public class CSVtoADFTableProcessor {
    private CoreTable table;
    private List rows = new ArrayList();
    private boolean useFirstRowAsHeaders = false;
    private int numberOfColumns;

    public CSVtoADFTableProcessor() {
    }

    public void processCSV(InputStream csvFile) {
        // Parse the data, using http://ostermiller.org/utils/download.html
        String[][] csvvalues = null;
        try {
            csvvalues = CSVParser.parse(new InputStreamReader(csvFile));
        } catch (IOException e) {
            // TODO: report the parse failure; return so that csvvalues
            // is not dereferenced while still null
            return;
        }
        rows = new ArrayList();

        numberOfColumns = 0; // will hold the highest column index encountered
        for (int i = 0; i < csvvalues.length; i++) {
            Map tablerow = new HashMap();
            for (int j = 0; j < csvvalues[i].length; j++) {
                if (j > numberOfColumns)
                    numberOfColumns = j;
                tablerow.put("cell" + (j + 1), csvvalues[i][j]);
            } // for cells
            rows.add(tablerow);
        } // for rows

        setupTableColumns();
    }

    private void setupTableColumns() {
        // take parsed data and create the columns for the ADF Faces table
        // as well as the rows list of table backing data set
        FacesContext fc = FacesContext.getCurrentInstance();
        Application app = fc.getApplication();
        table.getChildren().clear();
        CoreColumn col =
            (CoreColumn)app.createComponent(CoreColumn.COMPONENT_TYPE);
        col.setId("rowheader");
        CoreOutputText cell =
            (CoreOutputText)app.createComponent(CoreOutputText.COMPONENT_TYPE);
        cell.setId("rowcell");
        cell.setValueBinding("value",
                             app.createValueBinding("#{rowStatus.index}"));

        col.getChildren().add(cell);
        col.setHeaderText("^");
        table.getChildren().add(col);

        for (int i = 0; i < numberOfColumns + 1; i++) {
            col = (CoreColumn)app.createComponent(CoreColumn.COMPONENT_TYPE);
            col.setId("col" + i);
            cell =
                (CoreOutputText)app.createComponent(CoreOutputText.COMPONENT_TYPE);
            cell.setId("cell" + i);
            cell.setValueBinding("value",
                                 app.createValueBinding("#{row['cell" +
                                                        (i + 1) + "']}"));

            col.getChildren().add(cell);
            col.setValueBinding("headerText",
                                app.createValueBinding("#{CsvProcessor.columnHeaders['" +
                                                       i + "']}"));
            col.setSortable(true);
            col.setSortProperty("cell" + (i + 1));
            table.getChildren().add(col);
        }
    }

    public void setTable(CoreTable table) {
        this.table = table;
    }

    public CoreTable getTable() {
        return table;
    }

    public void setRows(List rows) {
        this.rows = rows;
    }

    public List getRows() {
        // guard against an empty list: subList(1, 0) would throw an
        // IllegalArgumentException before any file has been uploaded
        if (rows.isEmpty()) {
            return rows;
        }
        return rows.subList(useFirstRowAsHeaders ? 1 : 0, rows.size());
    }

    public Map getColumnHeaders() {
        Map columnHeaders = new HashMap();
        if (useFirstRowAsHeaders && !rows.isEmpty()) {
            for (int i = 0; i < numberOfColumns + 1; i++)
                columnHeaders.put(Integer.toString(i),
                                  ((Map)rows.get(0)).get("cell" + (i + 1)));
        } else {
            for (int i = 0; i < numberOfColumns + 1; i++)
                columnHeaders.put(Integer.toString(i),
                                  String.valueOf((char)('A' + i)));
        }
        return columnHeaders;
    }

    public void setUseFirstRowAsHeaders(boolean useFirstRowAsHeaders) {
        this.useFirstRowAsHeaders = useFirstRowAsHeaders;
    }

    public boolean isUseFirstRowAsHeaders() {
        return useFirstRowAsHeaders;
    }
}

 

The interesting bits and pieces are first of all in processCSV (a method that could do with a little refactoring – I leave that as an exercise to the reader). It turns the file content into a multi-dimensional array. Then it calls setupTableColumns() to manipulate the table: it removes the current list of children from the table component. Next it creates the first column, which will contain row numbers; the value for this column is set with an EL expression that refers to the rowStatus variable – see the af:table specification in the jspx page.

Then it creates a column for every field found in the file records. The column’s headerText is set using a value binding expression to an element in the columnHeaders map on this class. The column contents are also set using an EL expression in a ValueBinding; it refers to a field in the row variable of type Map. This corresponds with the af:table set up in the JSPX page:

    <af:table value="#{CsvProcessor.rows}" var="row"
              varStatus="rowStatus" binding="#{CsvProcessor.table}"
              partialTriggers="useFirstAsHeader">

The row variable is available during table rendering – it contains the individual elements retrieved from the #{CsvProcessor.rows} list, the data source for the table. This rows collection is an ArrayList that is set up with HashMap elements. The value binding

    cell.setValueBinding("value",
                         app.createValueBinding("#{row['cell" +
                                                (i + 1) + "']}"));
 

refers to the cell1, cell2, … celln keys in the HashMap.
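This rows-of-maps structure is easy to see in isolation. The sketch below (class and method names are illustrative, not taken from the article’s code) mirrors what processCSV() does with the parsed array:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Sketch of the table's backing data: every parsed record becomes a Map
// keyed cell1, cell2, ... celln, just as in processCSV().
public class RowMapper {
    public static List<Map<String, String>> toRows(String[][] csvvalues) {
        List<Map<String, String>> rows = new ArrayList<>();
        for (String[] record : csvvalues) {
            Map<String, String> tablerow = new HashMap<>();
            for (int j = 0; j < record.length; j++) {
                tablerow.put("cell" + (j + 1), record[j]);
            }
            rows.add(tablerow);
        }
        return rows;
    }

    public static void main(String[] args) {
        List<Map<String, String>> rows =
            toRows(new String[][] { { "John", "Amsterdam" } });
        // #{row['cell2']} would resolve against this map during rendering
        System.out.println(rows.get(0).get("cell2")); // prints Amsterdam
    }
}
```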

Note that all columns thus created are made sortable by setting the sortable property and specifying the sortProperty – cell1, cell2, … celln.

The last interesting bit to discuss is the checkbox Use first row values as Headers:

    <af:selectBooleanCheckbox id="useFirstAsHeader"
                              label="Use first row values as Headers"
                              autoSubmit="true"
                              value="#{CsvProcessor.useFirstRowAsHeaders}"/>
 

it is bound to the useFirstRowAsHeaders bean property in the CsvProcessor bean. It has its autoSubmit attribute set to true, meaning that the page values are submitted in an AJAX (PPR) request as soon as the checkbox is toggled. The table includes the id of this checkbox in its partialTriggers attribute, indicating that it should be refreshed whenever the checkbox is toggled.

The effect of toggling the checkbox, or toggling the useFirstRowAsHeaders boolean in the CSVtoADFTableProcessor class, is that the next call to getColumnHeaders() returns a different result: with useFirstRowAsHeaders set to false, the columnHeaders map contains the letters of the alphabet. When set to true, it returns the values from the first row read from the CSV file. In the latter case, the getRows() method returns the rows collection, starting at the second record:

    public List getRows() {
        // guard against an empty list: subList(1, 0) would throw an
        // IllegalArgumentException before any file has been uploaded
        if (rows.isEmpty()) {
            return rows;
        }
        return rows.subList(useFirstRowAsHeaders ? 1 : 0, rows.size());
    }
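The alphabetic fallback headers can also be sketched on their own. The method below (an illustrative stand-in, not the article’s code) reproduces the character arithmetic in getColumnHeaders(); note that beyond 26 columns this simple scheme runs past ‘Z’ into other characters:

```java
// Sketch of the alphabetic fallback headers produced when the first CSV
// row is NOT used as headers: column 0 -> A, column 1 -> B, and so on.
public class AlphabeticHeaders {
    public static String header(int columnIndex) {
        return String.valueOf((char) ('A' + columnIndex));
    }

    public static void main(String[] args) {
        System.out.println(header(0) + " " + header(1) + " " + header(25));
        // prints A B Z
    }
}
```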
 

Resources

Download JDeveloper 10.1.3.2 application: ProcessCVSFilesToADFTable.zip .

 


