AMIS Oracle and Java Blog (https://technology.amis.nl) - Friends of Oracle and Java

Introducing the Integration Cloud Service
https://technology.amis.nl/2015/05/27/introducing-the-integration-cloud-service/ (Wed, 27 May 2015 09:58:36 +0000)

Oracle released some more Cloud offerings and in this article we introduce the Integration Cloud Service. This cloud service lets your organization create integrations between cloud applications, but also between cloud and on-premise applications. Create connections to well-known and less well-known SaaS applications using a set of cloud adapters, publish or subscribe to the Messaging Cloud Service, or use industry standards like SOAP and REST. The available set of cloud adapters will certainly grow in the future once the marketplace is fully up and running.

Why should organizations consider the Cloud?
Let's get started with the key benefits and features before diving into them in more detail.
In this day and age more and more software is moving into the cloud; some of it is even developed with a cloud-first strategy. Think of your CRM, ERP or HCM application. These applications do not do business standalone: they communicate with each other and exchange information. The Integration Cloud Service (from now on: ICS) provides these integrations and does so in a simplified way.

The Cloud has a lot of advantages: it is probably the most cost-efficient way to use, maintain and upgrade an enterprise service bus. It is available at much lower rates and can therefore significantly lower a company's IT expenses. Besides, there are many pay-as-you-go and other scalable options available, which makes it very affordable for your organization. Since all your data is stored in the cloud, backing it up and restoring it is relatively much easier than doing the same on a physical device. Once you have registered in the cloud, you can access the information from anywhere with an Internet connection.

So what has the Integration Cloud Service to offer to meet these demands?

Simplified UI
ICS gives a web-based, point & click integration experience where you can easily create integrations between Cloud applications, public web services and on-premise applications.

Rich Connectivity
ICS has a standard library of Cloud & On-premise connectors which includes Oracle SaaS applications, but also connectors for the Messaging Cloud Service and industry standards like SOAP and REST.

Recommendations
The mapping builder, used to create the necessary mappings between the adapter connections, has a built-in recommendation engine that offers guidance on how to best map source fields to target fields.

Visibility & Error Detections
ICS has built-in rich monitoring and error management. With advanced tracking you can easily spot inconsistencies and monitor the usage and performance of integrations. It generates alerts, and even emails them, when connections fail to work. With the guided error handling the errors are easy to repair.

Overview of the Integration Cloud Service
Because it is fully web-based you only need to open a browser and go to the URL you received after creating your ICS instance. After signing in to the Integration Cloud Service you are welcomed by the home page.

ICS start page

The start page is made up of a tile for each major piece of functionality in ICS. Through this page you can easily learn more about a functionality, or you can navigate to it. All the functionalities are part of the Designer Portal, so my guess is that this page is not going to be used much, if at all. To navigate to the Designer Portal click on the associated menu item in the top right corner.

Designer Portal Page

The Designer Portal page shows the four pillars of ICS: Integrations, Connections, Lookups and Packages.

  • Integrations: Connect two cloud applications, using available connections, and define how they interact
  • Connections: Define connections to the cloud and on-premises applications
  • Lookups: Map the different values used by your applications to describe the same thing
  • Packages: A package is associated with integrations and can be used as a way to group them

Before you can create integrations between cloud applications you need to define the connections. It is also possible to create SOAP and Messaging Cloud connections on the fly while building an integration, but let's look at defining connections first.

Connections
At this moment almost ten adapters are available out of the box:

  • Oracle ERP Cloud: connector for the Oracle ERP Cloud
  • REST Adapter: generic connector for REST APIs
  • Web Service (SOAP) Adapter: generic connector for web services
  • Eloqua (Marketing Cloud): connector for the Oracle Marketing Cloud
  • Oracle Messaging Cloud Service: connector for the Messaging Cloud Service
  • Oracle HCM Cloud: connector for the Human Capital Management Cloud
  • Oracle Sales Cloud: connector for the Oracle Sales Cloud
  • Oracle RightNow: connector for the Customer Service Support Cloud
  • Salesforce: connector for the Salesforce CRM (SaaS)

Click on the Connections image on the Designer Portal page to navigate to the list of connections. By default all connections are listed. A connection can be in one of three statuses: draft, in progress or configured. Draft means it is not 100% finished, in progress means a user is working on it right now, and configured means it is 100% done and the connection test was successful.

All connections

You can look at only the connections that are in progress or configured by clicking on the status in the menu on the left side. If you're looking for specific entries you can search by entering the name, or part of the name, in the search box. You can use the * character as a wildcard.

Search Connections

Each connection displays its name, version and the kind of application it connects to. Each kind of application has its own image to differentiate it from the others. The status, the last update date and the user who made the update are also shown.

Connections Details

If you click on the Connection Details icon, an overlay appears with more details, like who created the connection and when. On each connection some actions can be executed. A connection can be edited, cloned or deleted. Some connections allow the metadata to be refreshed, like with the RightNow adapter.

Connection  Actions

Connections can be edited on the fly. If the WSDL URL or the credentials change, the settings can be updated. Let's look at the details of this RightNow connection.

Connection Settings

You can assign the email address of an administrator to the connection. Notifications are sent to this address when problems or changes occur in the connection. On the settings page, for this adapter, you can configure the connectivity and credentials.

Connections Connectivity Settings

Configure the WSDL of the RightNow Cloud service

Connection Credentials Settings

Configure the username and password to access the Cloud service with

Before a connection can be used by integrations it needs to be tested first. Click the Test button in the top right corner; a green notification is displayed if the test is successful and a red one if it fails.

Test Connection

In a separate article, which will be published in the upcoming week(s), I will go into full detail about creating connections.

Integrations
After defining the connections it is time to create an integration between two cloud connections. At this moment three types of integrations are possible:

  • Map My Data: drop a source and a target onto a blank canvas
  • Publish to ICS: connect your source to send messages to ICS
  • Subscribe to ICS: add targets to receive messages from ICS

Click on the Integrations image on the Designer Portal page to navigate to the list of integrations.

Designer Portal Integrations

By default all integrations are listed. An integration can be in one of five statuses: draft, in progress, configured, active or failed activation. Draft means it is not 100% finished, in progress means a user is working on it right now, configured means it is 100% done, active means a configured integration was successfully activated, and failed activation means the integration had problems during activation.

All Integrations

You can look at only integrations that are in progress, configured, active or failed by clicking on the status in the menu at the left side.

Configured Integrations Active Integrations Failed Integrations

If you're looking for specific entries you can search by entering the name, or part of the name, in the search box.
You can use the * character as a wildcard, for example KV*.

Search Integrations

On an integration it is possible to execute a few actions, depending on its status. An integration can be viewed, edited, cloned, exported and deleted. Active integrations can be deactivated. Some actions are disabled in certain statuses (e.g. it is not possible to edit an active integration).

Integration Actions

When viewing or editing an integration the Integration Canvas is used.

Integration Canvas

It consists of a source and a target adapter connection. Between the adapters you can create mappings for the request and for the response flow. It is also possible to enrich data by calling a secondary adapter (callout). This is possible on both the request and the response flow, just after the source and target adapter.

Let's have a look at the source adapter and the target adapter. In this example both are Generic SOAP connections. A Generic SOAP connection can be created on the fly, without defining the connection first.

SOAP Source wizard step 1

The first step consists of basic information and the choice to define the connection from an existing schema or, as in this example, a WSDL.

SOAP Source wizard step 2

Secondly, enter the WSDL URL and choose the Port Type and Operation to use for the incoming adapter. Besides a source, every integration needs a target. In this example the target is also a Generic SOAP connection; it works just like the source SOAP connection, but uses a different UI.

SOAP Target wizard

If extra data is needed that is not available in the request or response message of an adapter, it is possible to use callouts to a secondary adapter connection.

Integration Canvas Callouts

Because the data type of the request differs from that of the response, the data needs to be mapped. Click on the Request Mapping to view, create or edit the mapping. The request mapping is straightforward: the input is mapped to the only field available.

Integration Request Mapping

The response mapping maps the response from the target adapter to the source adapter. If you have callouts, the variable data is also available for this mapping. In the response mapping you can have access to a maximum of four data objects.

Integration Response Mapping

To view the XSLT mapping behind it, or to create more advanced mappings, click on a target element name that you want to map. In this detailed view mode you can map source fields to target fields, view the XSLT syntax used, and you have the possibility to edit the structure using Mapping Components.

Integration Mapping Builder

Mapping Components include functions for conversions, dates and strings, and Operators and XSL elements like choice, when, and other structures.

Integration Mapping Components
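
To give an impression of the kind of structure you can build with these Mapping Components, below is a minimal, hand-written XSLT sketch of a choice/when construct. The element names (Status, Priority) are hypothetical and only illustrate the shape of the generated XSLT; the actual elements depend on the shapes of your connections.

<xsl:choose>
  <!-- map an urgent source status to the highest target priority -->
  <xsl:when test="/source/Status = 'URGENT'">
    <Priority>1</Priority>
  </xsl:when>
  <!-- everything else gets a default priority -->
  <xsl:otherwise>
    <Priority>3</Priority>
  </xsl:otherwise>
</xsl:choose>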

Below is another example of an integration, but this one connects a generic SOAP connection with the Oracle RightNow adapter. Both the Web Service and the RightNow adapter support faults being passed through.

Integration Canvase with RightNow

Each adapter has its own kind of connection setup wizard. RightNow supports different operation modes (single or batch) and types (CRUD or ROQL). The CRUD operation type has four cloud operations: create, destroy, get and update. The RightNow adapter works with Business Objects defined in RightNow. It is possible to select multiple Business Objects.

Integration Rightnow

In a separate article, which will be published in the upcoming week(s), I will go in full details about creating integrations.

Lookups
The Integration Cloud Service also gives you the possibility to map the different values your applications use to describe the same thing, like currency codes. For everybody that uses SOA Suite: it's a DVM (Domain Value Map). Click on the Lookups image on the Designer Portal page to navigate to the list of lookups.

Designer Portal Lookups

The Lookups page shows all lookups in one list.

All Lookups

A few actions can be taken on each lookup. A lookup can be edited, cloned, exported and deleted.

Lookup Actions

A lookup is a table of connectors and domain value mappings. You can easily add other connectors or more values.

Edit Lookup

When adding a connector column you first need to select the connector to assign values to, for example the REST Adapter, and then enter the associated domain values.

Lookup Add Connector

Other features worth mentioning are the possibility to export and import lookups. The export format is CSV.

Export Lookup

Lookups can be used in the mappings between source and target in an integration. Use the lookupValue function and select the source value to map.

Use Lookup
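
As an impression of what such a mapping could look like behind the scenes, here is a hedged XSLT sketch based on the dvm:lookupValue function as known from SOA Suite DVMs. The lookup name (currencyCodes), the column names, the source element and the default value are made up for illustration, and the exact function signature in your ICS instance may differ.

<targetCurrency>
  <!-- look up the target-system code that corresponds to the source-system code -->
  <xsl:value-of select="dvm:lookupValue('currencyCodes', 'SourceSystem', /source/currencyCode, 'TargetSystem', 'EUR')"/>
</targetCurrency>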

In a separate article, which will be published on the 26th of May, I will go into full detail about creating and using lookups.

Packages
The last feature of ICS is packages. With packages you can group integrations together. When creating an integration you can assign it to a specific package name. Multiple integrations can be assigned to the same package name. Packages can be exported, imported and deleted, which means integrations can easily be transported to a different ICS instance.

To view all integrations that are part of a package, click on the “Action” icon and select “View Integrations”.

Package Actions

The pop-up shows the details of the integrations, e.g. description, creator, last updater and optionally the Endpoint URL at which the integration can be accessed.

View Package

Recap
Oracle's Integration Cloud Service is an hourly or monthly subscription-based Cloud solution that brings a web-based, point-and-click experience in which you can easily create integrations between Cloud applications, (public) web services and on-premise applications. It has a standard library of Cloud and on-premise connectors, which includes Oracle SaaS applications, but also connectors for the Messaging Cloud Service and industry standards like SOAP and REST.


Can someone explain to me where IT is heading? (Kan iemand mij uitleggen waar IT heen gaat?)
https://technology.amis.nl/2015/05/21/kan-iemand-mij-uitleggen-waar-it-heen-gaat/ (Thu, 21 May 2015 11:51:20 +0000)

"If you can't say it, then sing it." That used to be a commonly used saying when you couldn't put into words, or couldn't name, what you wanted to say. I often think back to this when I see the fantastic buzzwords passing by that describe new trends or hypes in our beautiful IT world. Sometimes it is old wine in new bottles, or only the vintage year has changed, but often the new buzzwords are so extravagant that you wonder: what do they actually mean by them?

Number 9 on Gartner's trend list

A nice example: Web Scale IT! This term is at number 9 on Gartner's trend list for 2015. The explanation Gartner gives for it:

“Gartner notes that more companies will think, act, and build applications and infrastructure in the same way that technology stalwarts like Amazon, Google, and Facebook do. There will be an evolution toward web-scale IT as commercial hardware platforms embrace the new models and cloud-optimised and software-defined methods become mainstream.”

Do you still follow? Another evolution? Embrace new models? Cloud-optimised? Software-defined?

What triggers me in Gartner's trend list

To top it off, Gartner says:

“Gartner notes that the marriage of development and operations in a coordinated way (referred to as DevOps) is the first step towards the web-scale IT.”

When I read the explanation, it strikes me that this is a very IT-based approach, whereas in my opinion we actually want IT and the business to get to work on new developments together. That they become friends in order to achieve common goals. Not him and her, but us, us together. Then one question after another comes to mind:

  • "Companies will build applications and infrastructure like Amazon, Google"? Are they going to start from scratch? Do they mean start-ups? Or has the law of the handicap of a head start been repealed? Do I no longer have a future with a 20-year-old or 120-year-old company?
  • These are very big steps. We have just gotten used to cloud-based, and we are already moving on to cloud-optimized. What is that again? The superlative? Level 2, 3 or 4?
  • Where is the voice of the business when we talk about software-defined methods and DevOps, the marriage between operations and development?
  • And what do they mean by 'a coordinated way'? Is there perhaps a roadmap to success? If so, I would love to receive it, because an opportunity like that is not one you want to let slip by. Right?

Resistance to change

I could go on like this for a while. What trends like these mainly overlook is that we are dealing with people. People who often resist change, especially the bigger changes. And can you blame them? Because what is not being told is what the changes can mean for an employee. That there are, for example, opportunities to develop or train further, that their experience becomes increasingly crucial, and that it is mainly the boring part of the work that can be automated. It is not without reason that more than 50% of IT projects fail because users and other stakeholders are not properly brought along…

An enormous number of trains

Changes also often come along at a rapid pace. It is not easy to determine which change you should embrace and which you should let pass you by. Which moving train do you jump on as a company, and which do you not? One thing is certain: there are a lot of trains running. You have to pick the right one and also jump on in time.

Exponential growth?

And then exponential growth will be yours! What kind of growth? Yes, exponential growth, in other words fast and hard. But what if I don't want to grow exponentially, but just want to run a solid business? Is that still allowed? Or has my right to exist already disappeared the moment I think this? Exponential has become too much of a magic word, too glorified in my opinion. What is wrong with solid entrepreneurship, where you obviously keep moving and changing, but above all listen carefully to what the customer wants? And sure, you don't want to wake up one day to find that the world around you has changed and that new competition has emerged. Not every industry has its Ubers and AirBnBs. Fortunately, there are plenty of examples of long-established companies that still keep up well today. Stay awake, yes of course, but not necessarily in order to grow exponentially.

What is it really all about?

So a lot of questions come to mind with new trends and buzzwords such as Web-scale IT. It may seem as if I am only raising objections and stepping on the brakes. But these kinds of questions are frequently the topic of conversation when I get to sit down with a CEO or other executive. They are all working on this, or at least thinking about it. Above all, they are thinking about where their opportunities lie, where they can make their contribution in the future, and how they can play an important role for the employees and customers of today and tomorrow. In doing so, I have not yet met anyone who indicated they needed Web-scale IT for that. They often don't even know the term exists, let alone what it means. Is it the name, and will the market come up with something else again in two years' time?

Either way: the conversation about what the future will look like for a company is many times more important than the buzzwords that may or may not go with it. What's in a name…

Stream Explorer and JMS for both inbound and outbound interaction
https://technology.amis.nl/2015/05/19/stream-explorer-and-jms-for-both-inbound-and-outbound-interaction/ (Tue, 19 May 2015 18:48:09 +0000)

In this article, we will look at the very common interaction between Stream Explorer and JMS. JMS is a commonly used channel for decoupled exchange of messages or events. Stream Explorer can both consume messages from a JMS destination (through a Stream) and publish findings to a JMS destination (with a Target). The use case we discuss here is about temperature sensors: small devices distributed over a building, measuring the local room temperature every few seconds and reporting it over JMS. The Stream Explorer application has to look out for rooms with quickly increasing temperatures and report those over a second JMS queue. Note: the Java (SE) code used for generating the temperature signals is described in a separate article (Interacting with JMS Queue and Topic from Java SE). The class TemperatureSensorSignalPublisher generates temperature values (in Celsius!) for a number of rooms, and publishes these to the queue temperatureMeasurements. At some random point, the class will start a fire in a randomly selected room. In this room, temperatures will soon be over 100 degrees. That article also contains the Java class HotRoomAlertProcessor that consumes messages from a second JMS Queue. Any message received on that queue is reported to the console.

Our objective in this article is to read the temperature measurements from the JMS Queue into a Stream Explorer application, calculate the average value per room and then detect the room on fire. This hot room should then be reported to the JMS Queue.

Open Stream Explorer and from the Stream Explorer Catalog page, create a new item of type Stream. Select JMS as the source type.

image

Press Next.

Configure the URL for the WebLogic domain (http://localhost:7101), the WebLogic Admin's username and password (weblogic/weblogic1) and the JNDI Name for the JMS Queue (or Topic): jndi/temperatureMeasurements

image

Press Next.

Define a new Shape. The properties in the JMS (Map)Message produced by the Java Class TemperatureSensorSignalPublisher are called RoomId (of type String) and Temperature (of type Float).
image

Press Create.

The Exploration editor appears to create an exploration based on the Stream.

Define a Name. Then click on Create.

image

The temperature measurement events start streaming in:

image

The first step is the definition of a Summary: calculate the average temperature per room. Also set the time range for the aggregation to 10 seconds (determine the temperature using the most recent 10 seconds worth of data) and the evaluation frequency to 5 seconds.

image

Fewer events are shown in the Live Output Stream – and with less variation.

Next, add a filter: we are going to hunt for the room on fire. Only records with an average temperature higher than 80 degrees should be reported. Also change the name of the property AVG_of_Temperature to AverageTemperature.

image

The screenshot shows that in this case it is the Cafeteria where there is a fire. If you stop class TemperatureSensorSignalPublisher and then start it again, it will take some time before it starts a fire again; once the fire has started, the Live Output Stream will show it.

Finally, click on Configure Target.

Configure a JMS Target, as shown in the figure. The URL is the familiar one (t3://localhost:7101), username and password are weblogic and weblogic1 and the JNDI Name of the JMS target is jndi/hotRooms.
image

Click on Finish. Publish the Exploration.

When there is now a room discovered with temperatures in the hot zone, a message will be published to the JMS Queue, in the form of a MapMessage with properties RoomId and AverageTemperature.

Stop and start class TemperatureSensorSignalPublisher. Run class HotRoomAlertProcessor to have it start listening to the jndi/hotRooms queue.

The former writes:

image

And the latter will report hot rooms by writing a message to the console:

image

While the Stream Explorer browser interface shows:

image

WebLogic Server and OpenLDAP. Using dynamic groups
https://technology.amis.nl/2015/05/18/weblogic-server-and-openldap-using-dynamic-groups/ (Mon, 18 May 2015 12:45:30 +0000)

Dynamic groups in an LDAP are groups which contain a query to specify its members instead of specifying every member separately. Efficient usage of dynamic groups makes user maintenance a lot easier. Dynamic groups are implemented differently in different LDAP server implementations. Weblogic Server can be configured to use dynamic groups in order to fetch users for a specific group. In this blog I will describe how dynamic groups can be created in OpenLDAP and used in Weblogic Server.

In this example I use two users: smeetsm, the developer, and doej, the operator. As shown in the title image, there are many servers which follow similar patterns to allow access to operators and developers. We are considering a case here where users do not use a shared account (e.g. weblogic) to log in to different systems. For traceability and security purposes this is a better practice than having everyone use the same shared user. See http://otechmag.com/magazine/2015/spring/maarten-smeets.html for a more thorough explanation of why you would want this.

A small note though. I’m a developer and this is not my main area of expertise. I have not implemented this specific pattern in any large scale organization.

Why dynamic groups?

In the group definition you can specify a query which determines members based on specific attribute values of users (e.g. privileges). What can you achieve with dynamic groups? You can provide an abstraction between users and groups which allows you to grant privileges by managing just the user attributes. Groups, which are usually defined per server, do not require as much changing this way. Since there are usually many servers (see the example above), this saves a lot of time.

For example, you can use the departmentNumber attribute to differentiate what developers and operators can do on different machines. For readability I have misused the employeeType attribute here, since it allows string content. In the image below there are two users: smeetsm, who is a developer, and doej, who is an operator. I have defined roles per server in the LDAP. The Monitor role on Server1 has smeetsm and doej as members because the memberURL query selects persons who have employeeType Developer or Operator. On Server1 only doej is Administrator and not smeetsm. This can for example be considered an acceptance test environment. On Server2 both are Administrator and Monitor. This can be considered a development environment. When smeetsm leaves and goes to work somewhere else, I just have to remove the Developer employeeType attribute at the user level and he won't be able to access Server1 and Server2 anymore. So there is no longer any problem with forgetting which servers a person has access to.

DevelopersAndOperators
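
To make this more concrete, below is a hedged LDIF sketch of what such dynamic group entries could look like. The DNs and the employeeType filters follow the naming used above; the actual entries are of course in the downloadable LDAP export.

# Monitor on Server1: every Developer or Operator
dn: cn=Monitor,ou=Server1,ou=groups,dc=smeetsm,dc=amis,dc=nl
objectClass: groupOfURLs
cn: Monitor
memberURL: ldap:///ou=people,dc=smeetsm,dc=amis,dc=nl??sub?(|(employeeType=Developer)(employeeType=Operator))

# Administrator on Server1: only Operators (so doej, not smeetsm)
dn: cn=Administrator,ou=Server1,ou=groups,dc=smeetsm,dc=amis,dc=nl
objectClass: groupOfURLs
cn: Administrator
memberURL: ldap:///ou=people,dc=smeetsm,dc=amis,dc=nl??sub?(employeeType=Operator)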

OpenLDAP configuration

Install

First download OpenLDAP from http://sourceforge.net/projects/openldapwindows.

In order to reproduce the configuration I have used, download the configuration and LDAP export: here

Put the slapd.conf in <OpenLDAP INSTALLDIR>\etc\openldap

Check if the password specified for the administrator works. I am not sure if the seed is installation dependent. You can generate a new password by going to <OpenLDAP INSTALLDIR>\bin and executing slappasswd -h {SSHA}

Start OpenLDAP by executing <OpenLDAP INSTALLDIR>\libexec\StartLDAP.cmd (or the shortcut in your startmenu)

Put the export.ldif in <OpenLDAP INSTALLDIR>\bin
Open a command-prompt and go to the <OpenLDAP INSTALLDIR>\bin

Execute ldapadd.exe -f export.ldif -xv -D “cn=Manager,dc=smeetsm,dc=amis,dc=nl” -w Welcome01

Now you can browse your OpenLDAP server using, for example, Apache Directory Studio. In my case I could use the following connection data to connect:

BindDN or user: cn=Manager,dc=smeetsm,dc=amis,dc=nl
Password: Welcome01

DevelopersAndOperators

The member field gets generated automatically (dynlist configuration in slapd.conf). This happens, however, only after a search is performed. WebLogic can't find this user's group membership when the group is treated as a static group (I've enabled authentication debugging to see this in the log: Server, Debug, weblogic.security.Atn):

<search(“ou=Server1, ou=groups, dc=smeetsm, dc=amis, dc=nl”, “(&(member=cn=doej,ou=people,dc=smeetsm,dc=amis,dc=nl)(objectclass=groupofurls))”, base DN & below)>
<getConnection return conn:LDAPConnection {ldaps://localhost:389 ldapVersion:3 bindDN:”cn=Manager,dc=smeetsm,dc=amis,dc=nl”}>
<Result has more elements: false>
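
For reference, the dynlist configuration referred to above typically looks something like the lines below in slapd.conf. This is a sketch based on the standard OpenLDAP dynlist overlay documentation; the exact module path and whether the overlay is compiled in statically depend on your installation, and the working configuration is part of the downloadable slapd.conf.

# load the dynlist overlay module (path/name may differ per build)
moduleload  dynlist.la

# expand memberURL queries of groupOfURLs entries into member values at search time
overlay     dynlist
dynlist-attrset groupOfURLs memberURL member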

Unless you want to invest time in getting to know your specific LDAP server in order to make the dynamic groups transparent to the client (so you can access them in a similar way as static groups), you're probably better off fixing this in WebLogic Server using dynamic groups (at least for development purposes). You can try, however, to let OpenLDAP produce memberof entries at the user level. This will perform better, as WebLogic then does not need to analyse all groups for memberURL entries to determine in which groups the user is present.

There are several tutorials available online for this (for example http://www.schenkels.nl/2013/03/how-to-setup-openldap-with-memberof-overlay-ubuntu-12-04/). Most, however, use OpenLDAP's online configuration (olc) and not slapd.conf. olc is the recommended way of configuring OpenLDAP and in most distributions the default; however, it was not the default in the one I was using.

From slapd.conf to olc (optional)

This part is optional. It might help if you’re planning to take a dive into the depths of OpenLDAP (don’t forget the oxygen… I mean coffee). You can convert your slapd.conf to an online configuration as shown below.

See http://www.zytrax.com/books/ldap/ch6/slapd-config.html. I had some problems with creation of the slapd.d directory so I first create another directory called ‘t’ and rename it. It is a good idea to also rename the slapd.conf in order to make sure this configuration file is not used anymore.

cd <OpenLDAP INSTALLDIR>\etc
mkdir t
<OpenLDAP INSTALLDIR>\sbin\slaptest.exe -f openldap\slapd.conf -F t
move t openldap\slapd.d
move openldap\slapd.conf openldap\slapd.conf.bak

Update the last line of <OpenLDAP INSTALLDIR>\libexec\StartLDAP.cmd to use the newly created directory for its configuration
slapd.exe -d -1 -h “ldap://%FQDN%/ ldaps://%FQDN%/” -F ..\etc\openldap\slapd.d

Create a user which can access cn=config. Update <OpenLDAP INSTALLDIR>\etc\openldap\slapd.d\cn=config\olcDatabase={0}config.ldif (from: http://serverfault.com/questions/514870/how-do-i-authenticate-with-ldap-via-the-command-line)

Add the following lines between olcMonitoring: FALSE and structuralObjectClass: olcDatabaseConfig. Use the same password as in the previously used slapd.conf (created with slappasswd -h {SSHA}):

olcRootDN: cn=admin,cn=config
olcRootPW: {SSHA}2HdAW3UmR5uK4zXOVwxO01E38oYanHUa

Now you can use a graphical LDAP client to browse cn=config. Authenticate using cn=admin,cn=config and use cn=config as Base DN. This makes browsing and editing configuration easier.

cnconfig

To add a configuration file you can do the following for example;

<OpenLDAP INSTALLDIR>\bin>ldapadd.exe -f your_file.ldif -xv -D “cn=admin,cn=config” -w Welcome01

This will get you started with other online tutorials about how to get the memberof overlay working.

WebLogic configuration

In the WebLogic Console, Security Realms, myrealm, Providers, New, OpenLDAPAuthenticator.

Use the following properties:
Common: Control Flag: SUFFICIENT. Also set the control flag for the DefaultAuthenticator to SUFFICIENT.

Provider specific

Connection

  • Host: localhost
  • Port: 389
  • Principle: cn=Manager,dc=smeetsm,dc=amis,dc=nl
  • Credential: Welcome01

Users

  • User Base DN: ou=people, dc=smeetsm, dc=amis, dc=nl
  • All users Filter:
  • User from name filter: (&(cn=%u)(objectclass=inetOrgPerson))
  • User Search Scope: Subtree
  • User name attribute: cn
  • User object class: person
  • Use Retrieved User Name as Principal: (leave unchecked)

Groups

  • Group Base DN: ou=Server1, ou=groups, dc=smeetsm, dc=amis, dc=nl
  • All groups filter:
  • Group from name filter: (&(cn=%g)(|(objectclass=groupofnames)(objectclass=groupofurls)))
  • Group search scope: Subtree
  • Group membership searching: unlimited
  • Max group membership search level: 0

Static groups

  • Static Group Name Attribute: cn
  • Static Group Object Class: groupofnames
  • Static Member DN Attribute: member
  • Static Group DNs from Member DN Filter: (&(member=%M)(objectclass=groupofnames))

Dynamic groups

  • Dynamic Group Name Attribute: cn
  • Dynamic Group Object Class: groupofurls
  • Dynamic Member URL Attribute: memberurl
  • User Dynamic Group DN Attribute:

GUID Attribute: entryuuid

Points of interest

  • The group from name filter specifies two classes. The class for the static groups and the class for the dynamic groups.
  • User Dynamic Group DN Attribute is empty. If you can enable generation of the memberof attribute in your LDAP server, you can use that.
  • The Group Base DN specifies the server (Server1). For Server2 I would use Server2 instead of Server1.
  • You can use static and dynamic groups together and also nest them. In the below image, Test3 is a groupofnames with smeetsm as static member. Monitor is a dynamic group. Be careful though with the performance. It might not be necessary to search entire subtrees to unlimited depth.

result2

Result

After the above configuration is done, you can log in to the WebLogic Console with user smeetsm on Server1 and get the Monitor role, while on Server2 with the same username you get the Administrator role.

result

If I change the employeeType of smeetsm to operator, I get the Administrator role on Server1. If I remove the attribute, I cannot access any system. User management can easily be done this way on user level with very little maintenance needed on group level (where there usually are many servers) unless for example the purpose of an environment changes. Then the query to obtain users needs changing.

I could not get the memberof attribute working in my OpenLDAP installation. Luckily for a development environment you don’t need this but if you plan on using a similar pattern on a larger scale, you can gain performance by letting the LDAP server generate these attributes in order to allow clients (such as WebLogic Server) to get quick insight into user group memberships.

Interacting with JMS Queue and Topic from Java SE
https://technology.amis.nl/2015/05/16/interacting-with-jms-queue-and-topic-from-java-se/ (Sat, 16 May 2015 15:33:22 +0000)

This article is just a quick post of some code I want to have easy access to. It runs in Java SE – outside any container in a stand alone JVM. It creates a connection with a JMS Queue. One class sends messages to the Queue, the other class registers as a listener and consumes messages from a different queue.

I have created the code in JDeveloper. It runs stand-alone and connects to a WebLogic Server where the JMS Queues (and JMS Server, JMS Module and JMS Connection Factory) have been created. (blog article http://blog.soasuitehandbook.org/setup-for-jms-resources-in-weblogic-chapter-6/ provides an example of how JMS resources are configured on WebLogic)

image

The project has two libraries associated with it: Java EE and WebLogic Remote Client.

image

 

The JDeveloper application TemperatureMonitoring (created for a Stream Explorer/Event Processing demonstration) contains two projects that each contain a single class. One project is HotRoomAlertProcessor with class HotRoomAlertProcessor that registers as a listener to the HotRooms queue. Any message received on that queue is reported to the console.

The second project is TemperatureSensors. It contains class TemperatureSensorSignalPublisher. This class generates temperature values (in Celsius!) for a number of rooms, and publishes these to the queue temperatureMeasurements. At some random point, the class will start a fire in a randomly selected room. In this room, temperatures will soon be over 100 degrees.

Class TemperatureSensorSignalPublisher, publishing to the JMS Queue:

package nl.amis.temperature;

import java.util.Hashtable;

import java.util.Random;

import javax.jms.JMSException;
import javax.jms.MapMessage;
import javax.jms.MessageProducer;
import javax.jms.Queue;
import javax.jms.QueueConnection;
import javax.jms.QueueConnectionFactory;
import javax.jms.QueueSession;
import javax.jms.Session;

import javax.naming.Context;
import javax.naming.InitialContext;
import javax.naming.NamingException;


public class TemperatureSensorSignalPublisher {
    public final static String JNDI_FACTORY = "weblogic.jndi.WLInitialContextFactory";
    public final static String JMS_FACTORY = "jms/handson-jms-connectionFactory";
    public final static String QUEUE = "jndi/temperatureMeasurements";
    private QueueConnectionFactory qconFactory;
    private QueueConnection qcon;
    private QueueSession qsession;
    private MessageProducer qproducer;
    private Queue queue;

    private static final int SLEEP_MILLIS = 100;
       private static Random rand = new Random();
       private boolean suspended;
       private int index = 0;

       public static int randInt(int min, int max) {
           // NOTE: Usually this should be a field rather than a method
           // variable so that it is not re-seeded every call.     
           // nextInt is normally exclusive of the top value,
           // so add 1 to make it inclusive
           int randomNum = rand.nextInt((max - min) + 1) + min;
           return randomNum;
       }

        public void run() {
            System.out.println("Started Producing Temperature Signals to "+QUEUE);
            suspended = false;
            while (!isSuspended()) { // Generate messages forever...
                generateTemperatureSensorSignal();
                try {
                    synchronized (this) {
                        wait(randInt(SLEEP_MILLIS/2, SLEEP_MILLIS*2));
                    }
                } catch (InterruptedException e) {
                    e.printStackTrace();
                }
            }
        }

        /* (non-Javadoc)
         * @see com.bea.wlevs.ede.api.SuspendableBean#suspend()
         */
        public synchronized void suspend() {
            suspended = true;
        }

        private synchronized boolean isSuspended() {
            return suspended;
        }

    String[] rooms = new String[]{"Cafeteria","Kitchen","Reception","Meetingroom One","CEO Office","Lounge Area","Office Floor A"};
    boolean onFire=false;
    int roomOnFireIndex ;

    private void generateTemperatureSensorSignal() {
        // determine roomId
        int roomIndex = randInt(1, rooms.length)-1;
        
        // determine if one room should be set on fire
        if (!onFire) {
            // roughly a 1-in-50 chance per generated signal that a fire starts (randInt(1,50) == 1)
            onFire = randInt(1,50) < 2;
            if (onFire){
              roomOnFireIndex = roomIndex;
              System.out.println("Fire has started in room "+ rooms[roomOnFireIndex]);
            }
        }        
        // determine temperatureValue: integer division of randInt(160, 230) by 11 yields a baseline of roughly 14-20 degrees Celsius
        float temperature = randInt(160, 230)/11;
        if (onFire && roomIndex == roomOnFireIndex) {           
            temperature = temperature + randInt(90, 150);
        }
        publish(rooms[roomIndex], temperature);        
    }


    public void publish(String roomId, Float temperature) {
        try {
            MapMessage message = qsession.createMapMessage();
            message.setString("RoomId", roomId);
            message.setFloat("Temperature", temperature);
            qproducer.send(message);
            //System.out.println("- Delivered: "+temperature+" in "+roomId);
        } catch (JMSException jmse) {
            System.err.println("An exception occurred: " + jmse.getMessage());
        }
    }

    public void init(Context ctx, String queueName)
        throws NamingException, JMSException
    {
        qconFactory = (QueueConnectionFactory) ctx.lookup(JMS_FACTORY);
        qcon = qconFactory.createQueueConnection();
        qsession = qcon.createQueueSession(false, Session.AUTO_ACKNOWLEDGE);
        queue = (Queue) ctx.lookup(queueName);
        qproducer = qsession.createProducer(queue);
    }

    public void close() throws JMSException {
        qsession.close();
        qcon.close();
    }

    public static void main(String[] args) throws Exception {
        InitialContext ic = getInitialContext();
        TemperatureSensorSignalPublisher qr = new TemperatureSensorSignalPublisher();
        qr.init(ic, QUEUE);
        qr.run();
        qr.close();
    }

    private static InitialContext getInitialContext()
        throws NamingException    {
        Hashtable<String, String> env = new Hashtable<String, String>();
        env.put(Context.INITIAL_CONTEXT_FACTORY, JNDI_FACTORY);
        env.put(Context.PROVIDER_URL, "t3://localhost:7101");
        return new InitialContext(env);

    }
    
}

Class HotRoomAlertProcessor  consumes messages from a second JMS Queue:

package nl.amis.temperature;

import java.util.Enumeration;
import java.util.Hashtable;

import javax.jms.JMSException;
import javax.jms.MapMessage;
import javax.jms.Message;
import javax.jms.MessageListener;
import javax.jms.Queue;
import javax.jms.QueueConnection;
import javax.jms.QueueConnectionFactory;

import javax.jms.QueueReceiver;
import javax.jms.QueueSession;

import javax.jms.Session;

import javax.naming.Context;
import javax.naming.InitialContext;
import javax.naming.NamingException;


public class HotRoomAlertProcessor implements MessageListener {
    public final static String JNDI_FACTORY = "weblogic.jndi.WLInitialContextFactory";
    public final static String JMS_FACTORY = "jms/handson-jms-connectionFactory";
    public final static String QUEUE = "jndi/hotRooms";
    private QueueConnectionFactory qconFactory;
    private QueueConnection qcon;
    private QueueSession qsession;
    private QueueReceiver qreceiver;
    private Queue queue;
    private boolean quit = false;

    public void onMessage(Message msg)     {
        try {
            if (msg instanceof MapMessage) {
                MapMessage mess = ((MapMessage) msg);
//                Enumeration enumeration = mess.getMapNames();
//                while (enumeration.hasMoreElements()) {
//                    System.out.println(enumeration.nextElement());
//                }
                System.out.println("Room On Fire: " + mess.getString("RoomId"));
                System.out.println("Last Measured Temperature: " + mess.getFloat("AverageTemperature"));
            }
        } catch (JMSException jmse) {
            System.err.println("An exception occurred: " + jmse.getMessage());
        }
    }

    public void init(Context ctx, String queueName)
        throws NamingException, JMSException     {
        qconFactory = (QueueConnectionFactory) ctx.lookup(JMS_FACTORY);
        qcon = qconFactory.createQueueConnection();
        qsession = qcon.createQueueSession(false, Session.AUTO_ACKNOWLEDGE);
        queue = (Queue) ctx.lookup(queueName);
        qreceiver = qsession.createReceiver(queue);
        qreceiver.setMessageListener((MessageListener) this);
        qcon.start();
    }

    public void close() throws JMSException    {
        qreceiver.close();
        qsession.close();
        qcon.close();
    }

    public static void main(String[] args) throws Exception {
        InitialContext ic = getInitialContext();
        HotRoomAlertProcessor qr = new HotRoomAlertProcessor();
        qr.init(ic, QUEUE);
        System.out.println("JMS Ready To Receive Messages (To quit, send a \"quit\" message).");
        synchronized (qr) {
            while (!qr.quit) {
                try {
                    qr.wait();
                } catch (InterruptedException ie) {
                }
            }
        }
        qr.close();
    }

    private static InitialContext getInitialContext()
        throws NamingException
    {
        Hashtable<String, String> env = new Hashtable<String, String>();
        env.put(Context.INITIAL_CONTEXT_FACTORY, JNDI_FACTORY);
        env.put(Context.PROVIDER_URL, "t3://localhost:7101/");
        return new InitialContext(env);
    }
}

Here is some output from the second class:

image

AMIS organizes: Workshop Stream Explorer and Oracle Event Processor, Tuesday 19 May 2015, 17:30
https://technology.amis.nl/2015/05/16/amis-organiseert-workshop-stream-explorer-and-oracle-event-processor-dinsdag-19-mei-2015-17-30-uur/ (Sat, 16 May 2015 05:11:41 +0000)

On Tuesday 19 May, starting at 17:30, a free community workshop on Stream Explorer and Oracle Event Processor will take place at AMIS (Edisonbaan 15, Nieuwegein), as part of the AMIS SOA SIG. Lucas Jellema will give a presentation in which he introduces Stream Explorer. He will show a number of demonstrations of Stream Explorer, OEP and the interaction with SOA Suite 12c. Participants will then get access to a Virtual Machine (Stream Explorer, OEP, SOA Suite 12c, Oracle Database 11gR2 XE and JDeveloper 12c) in which an extensive set of practical examples can be worked through. The hands-on session covers, among other things:

  • Stream Explorer aggregation and pattern detection
  • Interaction with SOA Suite via the Event Delivery Network
  • REST and Stream Explorer (inbound and outbound)
  • Stream Explorer and WebSockets for live dashboards
  • Using Stream Explorer for live monitoring of service execution in SOA Suite
  • Events published directly from the Oracle Database
  • Stream Explorer and JMS (inbound and outbound)
  • Editing Stream Explorer applications in the OEP IDE (JDeveloper), to add the power of OEP to the ease of use of Stream Explorer

If you are interested in taking part in the workshop, send an email to info @ amis.nl.

Note: for the hands-on you need a laptop with at least 8 GB RAM and 25 GB of free disk space. You do not really need any specific prior knowledge, since most actions with Stream Explorer can be performed by business users. Topics that come up include Java, PL/SQL, JSON, REST, JMS, WebSocket, JavaScript, HTML, CQL, XML, EDN and SOA Suite (Mediator, BPEL).

The hands-on instructions for the workshop can be downloaded here.

A visualization of a few of the topics that will be covered:

 

 

image

image

image

image

image

image

StreamExplorer pushing findings as JSON messages to a WebSocket channel for live HTML Dashboard updates
https://technology.amis.nl/2015/05/15/streamexplorer-pushing-findings-as-json-messages-to-a-websocket-channel-for-live-html-dashboard-updates/ (Fri, 15 May 2015 08:32:00 +0000)

A common desire when doing real-time event processing with Stream Explorer and/or Oracle Event Processor is the ability to present the findings from Stream Explorer in a live dashboard. This dashboard should hold a visualization of whatever information we have set up Stream Explorer to find for us – and it should always show the latest information.

User interfaces are commonly presented in web browsers and created using HTML(5) and JavaScript. As part of the HTML5 evolution that brought us today's browsers, we now have the ability to use Web Sockets, through which we can push information from server to browser and have the user interface updated based on messages pushed from the server. This allows us to create a dashboard that, from the browser, listens to a Web Socket and uses whatever messages appear on the web socket to update the user interface. Such a dashboard and its implementation using standard Java (EE) was discussed in a recent article: Java Web Application sending JSON messages through WebSocket to HTML5 browser application for real time push. The results from that article provide the foundation for the article you are reading right now.

We will create a Stream Explorer application that exposes a REST interface to which we will publish JSON messages (in this example using SoapUI as the client from which to generate the test events). These messages report on groups of people entering or leaving a specific room in a movie theater. The exploration we create will aggregate the information from the messages – providing us with constant insight into the total number of people in each room. This information is subsequently pushed to the REST service exposed by a Java EE application that routes that information across the web socket to the HTML5 client. The next figure illustrates the application architecture:

image

In this article, we will assume that the Java EE application, including the dashboard, is already available, as described in the referenced article. All we need to do is

  • Create a Stream exposed as (inbound) REST interface – discussed in this article.
  • Create an Exploration on top of this Stream – to aggregate the events from the Stream.
  • Configure a target for this Exploration using the outbound REST adapter (an example of which is discussed here) and publish the exploration.
  • Run the Java EE application, open the dashboard and publish messages to the Stream Explorer REST service; watch the dashboard as it constantly updates to reflect the actual status

 

After configuring the Stream (as discussed in this article), create an exploration, for example called CinemaExploration. Create a Summary of type SUM based on the property partySize and group by room. Edit the Properties and change the name of property SUM_of_partySize to occupation. The exploration will look like this:

 

image

We can start pushing some messages to it from SoapUI:

image

based in part on twice sending this SoapUI request:

image
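
The exact payloads are only visible in the screenshots, but to give an idea: a cinema event message posted to the Stream Explorer REST endpoint would look something like the JSON below. The property names partySize and room follow the Stream shape described above; the values, and the convention of a negative partySize for a group leaving a room, are assumptions for illustration.

{
  "room": "Room 1",
  "partySize": 4
}

A finding pushed by the exploration to the REST target of the Java EE application would then carry the aggregated occupation per room, for example:

{
  "room": "Room 1",
  "occupation": 23
}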

 

Next, click on Configure a Target.

image

Select type REST and set the URL

image

Click on Finish.

Publish the Exploration.

image

 

The dashboard is opened:

image

Now we can run a test case in SoapUI to send test messages to the Stream Explorer application:

image

 

Here is what the live output stream in the Stream Explorer UI shows next to a screenshot taken of the Cinema Monitor dashboard:

image

The dashboard is constantly updated with the most recent finding published by Stream Explorer. Note: the notion of having a negative occupancy is one that will require some explaining! (More careful test data management seems to be called for.)

After running some more of the SoapUI Test Cases that publish cinema events to the RESTful entry point of the Stream Explorer application, the situation is as follows:

image

Weblogic Console and BPM Worklist. Authentication using OpenLDAP https://technology.amis.nl/2015/05/15/weblogic-console-and-bpm-worklist-authentication-using-openldap/ https://technology.amis.nl/2015/05/15/weblogic-console-and-bpm-worklist-authentication-using-openldap/#comments Fri, 15 May 2015 07:47:23 +0000 https://technology.amis.nl/?p=36010 In this blog I will illustrate how you can configure Weblogic Server to use OpenLDAP as authentication provider and to allow OpenLDAP users to login to the Oracle BPM Worklist application. In a previous blog I have already shown how to do Weblogic Authentication with ApacheDS (LDAP and Weblogic; Using ApacheDS as authentication provider for [...]

The post Weblogic Console and BPM Worklist. Authentication using OpenLDAP appeared first on AMIS Oracle and Java Blog.

]]>
In this blog I will illustrate how you can configure Weblogic Server to use OpenLDAP as authentication provider and to allow OpenLDAP users to login to the Oracle BPM Worklist application. In a previous blog I have already shown how to do Weblogic Authentication with ApacheDS (LDAP and Weblogic; Using ApacheDS as authentication provider for Weblogic). In this blog I will use OpenLDAP to also do BPM Worklist authentication.

title

Why use OpenLDAP?

Oracle Platform Security Services (OPSS) supports the use of several authentication providers. See: http://docs.oracle.com/cd/E23943_01/core.1111/e10043/devuserole.htm#JISEC2474. OpenLDAP is the only open source provider in the list below.

  • Microsoft Active Directory
  • Novell eDirectory
  • Oracle Directory Server Enterprise Edition
  • Oracle Internet Directory
  • Oracle Virtual Directory
  • OpenLDAP
  • Oracle WebLogic Server Embedded LDAP Directory
  • Microsoft ADAM
  • IBM Tivoli

The fact that you can use a certain provider for Weblogic authentication does not automatically mean you can also use its users in Fusion Middleware applications that use JPS, such as the BPM Worklist application. The possible authentication providers in Weblogic Server cover a wider range of servers and mechanisms than can be used in JPS out of the box.

What causes this limitation? Well, most Fusion Middleware applications (all, as far as I’ve seen) only look at the first LDAP provider for authentication. This is usually the default authenticator (the Weblogic Embedded LDAP server). When I add another LDAP authenticator, it is ignored. The solution is straightforward: use a single LDAP. If you don’t want that, you can also virtualize several LDAPs and offer them as a single LDAP for the application to talk to. The most common solutions for this are Oracle Virtual Directory (OVD, http://docs.oracle.com/cd/E12839_01/oid.1111/e10036/basics_10_ovd_what.htm) and LibOVD. Oracle Virtual Directory is a separate product. LibOVD is provided with Weblogic Server but does not have its own web interface and is limited in functionality (and configuration is more troublesome in my opinion). When (for example for ApacheDS) you specify the generic LDAPAuthenticator and not a specific one such as the one for OpenLDAP, you need to specify an idstore.type in jps-config.xml in DOMAINDIR\config\fmwconfig. This idstore.type is limited to the list below (see https://docs.oracle.com/cd/E14571_01/core.1111/e10043/jpsprops.htm#JISEC3159):

  • XML
  • OID – Oracle Internet Directory
  • OVD – Oracle Virtual Directory
  • ACTIVE_DIRECTORY – Active Directory
  • IPLANET – Sun Java System Directory Server
  • WLS_OVD – WebLogic OVD
  • CUSTOM – Any other type

CUSTOM can be any type, but mind that if you specify CUSTOM, you also need to supply an implementation of the oracle.security.idm.IdentityStoreFactory interface in the property ‘ADF_IM_FACTORY_CLASS’; here you are limited to the available implementations or you have to build your own. When using OpenLDAP, you don’t have this problem.
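For illustration, such an idstore.type entry in jps-config.xml could look roughly like the fragment below. This is only a sketch: the serviceInstance name and provider attribute depend on your domain configuration, and nl.amis.security.MyIdentityStoreFactory is a hypothetical class name – only the property names idstore.type and ADF_IM_FACTORY_CLASS come from the documentation referenced above.

<serviceInstance name="idstore.ldap" provider="idstore.ldap.provider">
  <property name="idstore.type" value="CUSTOM"/>
  <property name="ADF_IM_FACTORY_CLASS" value="nl.amis.security.MyIdentityStoreFactory"/>
  <!-- further LDAP connection properties go here -->
</serviceInstance>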

Configuring OpenLDAP

Installing

This has been described on various other blogs such as https://blogs.oracle.com/jamesbayer/entry/using_openldap_with_weblogic_s and http://biemond.blogspot.nl/2008/10/using-openldap-as-security-provider-in.html. I’ll not go into much detail here, just describe what I needed to do to get it working.

First install OpenLDAP. I used a Windows version since at the time of writing this blog I was sitting behind a Windows computer: http://sourceforge.net/projects/openldapwindows. There are also plenty of other versions. The benefit of this version (I downloaded 2.4.38) is that it pretty much works out of the box. I updated part of the etc\openldap\slapd.conf file, which you can see below, to provide my own domain and update the Manager password. The password (you can create an SSHA version of it by looking at https://onemoretech.wordpress.com/2012/12/17/encoding-ldap-passwords/) is ‘Welcome01’ in my case. There are a couple of other references to the dc=example,dc=com domain in the config file which you should replace as well.
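If your OpenLDAP distribution includes the slappasswd tool (in the Windows build I used it sits in the same bin directory as ldapadd.exe), you can generate such an {SSHA} value yourself. A sketch – the exact executable name and location may differ per distribution:

slappasswd.exe -h {SSHA} -s Welcome01

Paste the resulting {SSHA} string into the rootpw entry of slapd.conf.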

#######################################################################
# BDB database definitions
#######################################################################
database bdb
suffix "dc=smeetsm,dc=amis,dc=nl"
rootdn "cn=Manager,dc=smeetsm,dc=amis,dc=nl"
# Cleartext passwords, especially for the rootdn, should
# be avoid. See slappasswd(8) and slapd.conf(5) for details.
# Use of strong authentication encouraged.
rootpw {SSHA}2HdAW3UmR5uK4zXOVwxO01E38oYanHUa
# The database directory MUST exist prior to running slapd AND
# should only be accessible by the slapd and slap tools.
# Mode 700 recommended.
directory ../var/openldap-data
# Indices to maintain

index default pres,eq
index objectClass eq
index uniqueMember eq

access to attrs=userPassword
by dn="cn=Manager,dc=smeetsm,dc=amis,dc=nl" write
by anonymous auth
by * none

access to dn.base=""
by * read

access to *
by dn="cn=Manager,dc=smeetsm,dc=amis,dc=nl" write
by * read

access to *
by dn="cn=root,dc=smeetsm,dc=amis,dc=nl" write
by * read

Adding users

Commandline with an ldif file

I used Apache Directory Studio to add users in a graphical way (described below). The result I exported to the ldif file below (all passwords are ‘Welcome01’). After importing it you have a sample Administrator user and group available which correspond to the Weblogic Server configuration further below. You can save the file as base.ldif.

version: 1

dn: dc=smeetsm,dc=amis,dc=nl
objectClass: top
objectClass: domain
dc: smeetsm

dn: ou=people,dc=smeetsm,dc=amis,dc=nl
objectClass: top
objectClass: organizationalUnit
ou: people

dn: cn=smeetsm,ou=people,dc=smeetsm,dc=amis,dc=nl
objectClass: top
objectClass: person
objectClass: organizationalPerson
objectClass: inetOrgPerson
cn: smeetsm
sn: Smeets
userPassword:: e3NzaGF9Y1lEOE9hM09IdjhGWjFQSVZPWG9DMTFHeDBvQThZcVV1TGV5aVE9P
Q==

dn: ou=groups,dc=smeetsm,dc=amis,dc=nl
objectClass: top
objectClass: organizationalUnit
ou: groups

dn: cn=Administrators,ou=groups,dc=smeetsm,dc=amis,dc=nl
objectClass: top
objectClass: groupOfNames
cn: Administrators
member: cn=smeetsm,ou=people,dc=smeetsm,dc=amis,dc=nl

On an empty database (configured with the slapd.conf above) you can import this as follows:

ldapadd.exe -f base.ldif -xv -D "cn=Manager,dc=smeetsm,dc=amis,dc=nl" -w Welcome01
(ldapadd.exe is in the bin directory of my OpenLDAP installation)

With a GUI (Apache Directory Studio)

Download Apache Directory Studio from: https://directory.apache.org/studio/. First create a connection in Apache Directory Studio. Use the same login data as specified in the slapd.conf file.

Host: localhost port: 389
BindDN or user: cn=Manager,dc=smeetsm,dc=amis,dc=nl
Password: Welcome01

Next, right-click Root DSE. Add a new entry. Create from scratch. Add the ‘domain’ object class.

AddANewEntry

Specify parent: ‘dc=smeetsm,dc=amis,dc=nl’
Specify RDN: ‘dc=smeetsm’

AddADomain

Using a similar method, you can look at the ldif file above to add the other entries. You only have to add the last object class per entry, as the other classes are its superclasses (do check though). The end result will be:

EndResult
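To verify the entries outside of Apache Directory Studio, you could run an ldapsearch against the server – a sketch, assuming the Manager credentials and base DN used above:

ldapsearch -x -H ldap://localhost:389 -D "cn=Manager,dc=smeetsm,dc=amis,dc=nl" -w Welcome01 -b "dc=smeetsm,dc=amis,dc=nl" "(objectClass=*)"

This should list the people and groups organizational units together with the smeetsm user and the Administrators group.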

Weblogic Server configuration

Authentication provider configuration

This part has been described in other posts as well. I’ll briefly repeat it here for completeness.

In your security realm add a new authentication provider and select OpenLDAPAuthenticator. Fill in the details below:

Group Base DN: ou=groups,dc=smeetsm,dc=amis,dc=nl
Static Group Object Class: groupOfNames
User Base DN: ou=people,dc=smeetsm,dc=amis,dc=nl
User Object Class: inetOrgPerson
Principal: cn=Manager,dc=smeetsm,dc=amis,dc=nl
Host: localhost
Credential: Welcome01
Static Group DNs from Member DN Filter: (&(member=%M)(objectclass=groupOfNames))
User From Name Filter: (&(cn=%u)(objectclass=inetOrgPerson))
Group From Name Filter: (&(cn=%g)(objectclass=groupOfNames))

Mind that the DefaultAuthenticator and your newly created authenticator should both have their control flag set to SUFFICIENT.

You can now use the new user to log in to the Weblogic Console and Enterprise Manager. In this example I have added the user to the Administrators group. If you don’t want that, you can create your own group and add the users to that group. Such a user won’t be able to log in to the Weblogic Console, but using the worklist application will work if the configuration below is also done.

LibOVD configuration

You can enable LibOVD as specified on http://fusionsecurity.blogspot.nl/2012/06/libovd-when-and-how.html. Set the virtualize=true property from the Enterprise Manager Fusion Middleware Control: click the arrow next to Security Provider, click Configure and add the property.
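For reference, the virtualize flag ends up as a property of the LDAP identity store serviceInstance in DOMAINDIR\config\fmwconfig\jps-config.xml. The sketch below shows what that property looks like; the serviceInstance name and the other properties depend on your domain and are illustrative only:

<serviceInstance name="idstore.ldap" provider="idstore.ldap.provider">
  <!-- existing identity store properties -->
  <property name="virtualize" value="true"/>
</serviceInstance>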

SecurityProviderConfiguration

In order to allow people to log in to the worklist application, they should have a valid role, as you can see in the screenshot below. You can of course also make this more specific.

ValidUsersCanLogin

Thus, after the virtualize=true property has been set (and the server has been restarted), you can add users to your OpenLDAP and they can be assigned tasks. When working with tasks, I do recommend mapping the application roles to LDAP groups rather than to specific users directly. This will make managing the users a lot easier at a later stage (especially when working with Organizational Units).

Now you can log in to the Oracle BPM Worklist application. You won’t see much yet since no tasks have been assigned, but you can assign tasks to this user or to the group it belongs to.

BPMWorkList

Resources

OVD JPS properties
https://docs.oracle.com/cd/E14571_01/core.1111/e10043/jpsprops.htm#JISEC3159

OpenLDAP with Weblogic
https://blogs.oracle.com/jamesbayer/entry/using_openldap_with_weblogic_s

OpenLDAP Windows
http://sourceforge.net/projects/openldapwindows

Encoding LDAP passwords
https://onemoretech.wordpress.com/2012/12/17/encoding-ldap-passwords/

LibOVD idstore.type for ApacheDS?
https://twitter.com/lucasjellema/status/525953726453137408

Identity store providers
http://docs.oracle.com/cd/E23943_01/core.1111/e10043/devuserole.htm#JISEC2718

LibOVD when and how?
http://fusionsecurity.blogspot.nl/2012/06/libovd-when-and-how.html

The post Weblogic Console and BPM Worklist. Authentication using OpenLDAP appeared first on AMIS Oracle and Java Blog.

]]>
https://technology.amis.nl/2015/05/15/weblogic-console-and-bpm-worklist-authentication-using-openldap/feed/ 0
Java Web Application sending JSON messages through WebSocket to HTML5 browser application for real time push https://technology.amis.nl/2015/05/14/java-web-application-sending-json-messages-through-websocket-to-html5-browser-application-for-real-time-push/ https://technology.amis.nl/2015/05/14/java-web-application-sending-json-messages-through-websocket-to-html5-browser-application-for-real-time-push/#comments Thu, 14 May 2015 07:55:59 +0000 https://technology.amis.nl/?p=35980 This article describes a Java EE 7 web application that exposes a REST service that handles HTTP POST requests with JSON payload. Any message received is sent through a Web Socket to the web socket (server) endpoint that is published by a Java Class deployed as part of the web application. A static HTML page [...]

The post Java Web Application sending JSON messages through WebSocket to HTML5 browser application for real time push appeared first on AMIS Oracle and Java Blog.

]]>
image

This article describes a Java EE 7 web application that exposes a REST service that handles HTTP POST requests with JSON payload. Any message received is sent through a Web Socket to the web socket (server) endpoint that is published by a Java Class deployed as part of the web application. A static HTML page with two associated JavaScript libraries is opened in a web browser and has opened a web socket connection to this same end point. The message sent from the REST service endpoint to the web socket endpoint is pushed through the web socket to the browser and used to instantly update the web page.

The specific use case that is implemented is a simple web dashboard to monitor a movie theater: the current number of people in each of the four rooms of this movie theater is observed. The REST service receives the actual spectator count and through the two web socket interactions, this count ends up in the browser and in the visual presentation.

Below you will find a step by step instruction for implementing this use case including all required source code. The implementation uses only standard technologies: Java EE 7 (including JAX-RS and Web Socket ) and plain HTML5 and JavaScript – no special libraries are involved. The code is developed in Oracle JDeveloper (12c) and deployed to Oracle WebLogic  (12c). However, given that only standard components are used, the same code should work equally well on other containers and from other IDEs.

Note: for the use case presented in this article, a solution using Server Sent Events (SSE) would also be possible – a simpler and lighter-weight approach than web sockets. SSE is uni-directional (server-to-client push only) and that of course is exactly what I am doing in this particular example.

The steps will be:

  • Implement the REST service to handle json payloads in HTTP Post requests
  • Implement the WebSocket endpoint
  • Interact from REST service endpoint with WebSocket (endpoint)
  • Implement the HTML and JavaScript web application to present the live status for the movie theater based on the web socket messages

The starting point is a basic Java EE web application – with no special setup in web.xml or other files.

The final application looks like this:

image

For JDeveloper 12c users: the required libraries are JAX-RS Jersey Jettison (Bundled), JAX-RS Jersey 2.x, WebSocket, Servlet Runtime.

Implement the REST service to handle json payloads in HTTP Post requests

Publishing a REST service from Java (EE) is done using JAX-RS. In an earlier post, I described how to expose a REST service from Java SE (outside of Java EE containers, leveraging the HTTP Server in the JVM). Publishing from within a Java EE container is very similar – and even easier. All we need is a single class with the right annotations.

image

The class is shown below. It is called MovieEvent (although the Class name does not matter at all). The class is annotated with the @Path annotation that is part of the JAX-RS specification. Because of this annotation, the class is published as a REST resource. The string parameter in this annotation (“cinemaevent”) defines the resource name as used in the URL for this REST service. The entire URL where this service can be invoked will be http://host:port/<web application root>/resources/cinemaevent. Note the segment resources that comes between the web application root and the name of the resource. That one had me confused for some time. The web application root for this application is set to CinemaMonitor by the way.

package nl.amis.cinema.view;

import javax.ws.rs.Consumes;
import javax.ws.rs.GET;
import javax.ws.rs.POST;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;
import javax.ws.rs.core.Context;
import javax.ws.rs.core.Request;

//invoke at : http://localhost:7101/CinemaMonitor/resources/cinemaevent
@Path("cinemaevent")
public class MovieEvent {

public MovieEvent() {
}

@POST
@Consumes("application/json")
@Produces("text/plain")
public String postMovieEvent(@Context Request request, String json) {
System.out.println("received event:" + json);
return "event received " + json;
}

@GET
@Produces("text/plain")
public String getMovieEvent(@Context Request request) {
return "nothing to report from getMovieEvent";
}
}

Method postMovieEvent is annotated with @POST – making it the handler of POST requests sent to the REST resource (at the URL discussed above). However, the annotation @Consumes(“application/json”) ensures that only HTTP requests with their content-type set to application/json will actually be delivered to this method. The JSON payload of these requests is passed in the json input parameter. Note that its name is not relevant; what matters is that it is the first String input parameter to this method without special annotations such as @Context.

At present, the method does nothing useful – it writes the JSON payload to the system output and returns a fairly bland confirmation message. Before long, we will extend both this method and the entire class to interact with the web socket.

The REST service can be tested, for example from SoapUI or a Java client program by sending requests such as this one:

image
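If you prefer the command line over SoapUI, a roughly equivalent test call could be made with curl – a sketch, assuming the CinemaMonitor context root and port 7101 used throughout this article; the payload values are arbitrary at this stage since the method only logs them:

curl -X POST -H "Content-Type: application/json" -d '{"room":"1","occupation":"12"}' http://localhost:7101/CinemaMonitor/resources/cinemaevent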

The corresponding output in the console (from the running Java EE container):

image

Implement the WebSocket endpoint

Implementing a WebSocket endpoint in Java EE is governed by the JSR-356 specification. A very good overview of how to use web sockets in Java applications is provided in this article.

Turning a Java Class into a WebSocket endpoint is actually quite simple. Use a few annotations, and we are in business. It is important to realize that even though the class that acts as the Web Socket server (endpoint) is deployed in this case as part of a Java EE web application, it stands quite apart from the rest of that application. The web socket endpoint can be accessed, not just from browsers but from Java clients as well. But there is no instance of the Class that is accessible for direct Java calls, nor does the WebSocket endpoint hook into EJBs, JMS destinations or JSF managed beans. It is a rather isolated component within the Java EE application. It does however have the potential to consume CDI Events (as described by Bruno Borges in this excellent article that inspired me to write this piece).

Create a Class called CinemaEventSocketMediator. Add the following annotation: @ServerEndpoint(“/cinemaSocket/{client-id}”). This turns the class into a web socket endpoint that exposes a Web Socket [channel] at ws://host:port/<web application context root>/cinemaSocket (in this case that will be ws://localhost:7101/CinemaMonitor/cinemaSocket). The final segment of the URL (/{client-id}) introduces a path parameter. The address of the Web Socket ends with the segment cinemaSocket. Anything that is added behind it is interpreted as an additional parameter that can be leveraged in the onOpen, onClose and onMessage methods through the @PathParam annotation – as we will see next.

A collection of peers is defined in which each client that starts a web socket connection will be retained.

The onOpen method – or rather the method annotated with @OnOpen – is invoked when a new client starts communications over the web socket channel. This method saves the session to the peers collection and returns a welcoming message to the new contact. Note how through the @PathParam annotated input parameter the method knows a little bit more about the client, provided the client did indeed add some content after the ‘regular’ Web Socket URL.

The method decorated with @OnMessage is triggered when a message arrives on the Web Socket [channel]. In this case, the message is received and instantiated as a JSONObject. This would allow us to perform JSON-style operations on the message (extract nested data elements, manipulate and add data). However, at present, all we do is pass the message to each of the peers, regardless of where the message came from (client-id) or what contents it carries. Note that the method would have to handle an exception if the message were not correct JSON data.

Finally the method with @OnClose handles clients closing their web socket channel connection. These clients are removed from the peers collection.

This particular WebSocket endpoint does not do anything that is special for the use case at hand. There are no references to movies, cinema events or the like in this class. There could be logic that interprets messages and routes them based on their content, for example, but there does not need to be such business-specific logic.

 

package nl.amis.cinema.view.push;

import java.io.IOException;

import java.util.Collections;
import java.util.HashSet;
import java.util.Set;

import javax.websocket.OnClose;
import javax.websocket.OnMessage;
import javax.websocket.OnOpen;
import javax.websocket.Session;
import javax.websocket.server.PathParam;
import javax.websocket.server.ServerEndpoint;

import org.codehaus.jettison.json.JSONException;
import org.codehaus.jettison.json.JSONObject;

@ServerEndpoint("/cinemaSocket/{client-id}")
public class CinemaEventSocketMediator {

private static Set<Session> peers = Collections.synchronizedSet(new HashSet<Session>());

@OnMessage
public String onMessage(String message, Session session, @PathParam("client-id") String clientId) {
try {
JSONObject jObj = new JSONObject(message);
System.out.println("received message from client " + clientId);
for (Session s : peers) {
try {
s.getBasicRemote().sendText(message);
System.out.println("send message to peer ");
} catch (IOException e) {
e.printStackTrace();
}

}
} catch (JSONException e) {
e.printStackTrace();
}
return "message was received by socket mediator and processed: " + message;
}

@OnOpen
public void onOpen(Session session, @PathParam("client-id") String clientId) {
System.out.println("mediator: opened websocket channel for client " + clientId);
peers.add(session);

try {
session.getBasicRemote().sendText("good to be in touch");
} catch (IOException e) {
}
}

@OnClose
public void onClose(Session session, @PathParam("client-id") String clientId) {
System.out.println("mediator: closed websocket channel for client " + clientId);
peers.remove(session);
}
}

Interact from REST service endpoint with WebSocket (endpoint)

The JSON messages received by the REST service exposed by class MovieEvent should be pushed through the Web Socket to the (external) clients, i.e. the web browser. There are two main options for handing these messages from class MovieEvent to the CinemaEventSocketMediator. One is through the use of CDI Events (as was mentioned above) and the other is by making class MovieEvent another client of the Web Socket [channel] exposed by CinemaEventSocketMediator. In this case, we opt for the latter strategy. Note that this means there is no need for the MovieEvent class and the CinemaEventSocketMediator class to be in the same web application; their only interaction takes place across the web socket and they have no dependencies. I have included them in the same application for easy deployment. The same applies by the way to the client side of this article: the HTML and JavaScript that are loaded by the browser to present the dashboard to the end user. This too is currently included in the same web application and it too only has interaction over the web socket. There is no real reason for it to be part of the same application.

Using an excellent description on StackOverflow, I have created class MovieEventSocketClient with the @ClientEndpoint annotation. This class acts as a client to the Web Socket. It is the counterpart of CinemaEventSocketMediator, which is more or less the host or server for the web socket. The constructor for this class has two important steps. First, through the ContainerProvider (the provider class that allows the developer to get a reference to the implementation of the WebSocketContainer) a reference to the WebSocketContainer is retrieved. This is an implementation-provided object that gives applications a view on the container running them; it holds various configuration parameters that control default session and buffer properties of the endpoints it contains, and it also allows the developer to deploy websocket client endpoints by initiating a web socket handshake from the provided endpoint to a supplied URI where the peer endpoint is presumed to reside. Second, using this container reference, through the connectToServer method, the client endpoint MovieEventSocketClient is connected to its server (this method blocks until the connection is established, or throws an error if either the connection could not be made or there was a problem with the supplied endpoint class).

Class MovieEventSocketClient has methods annotated with @OnOpen, @OnClose and @OnMessage with more or less the same role as the counterparts in CinemaEventSocketMediator (and in the JavaScript client as we will see later). Note how in the @OnOpen annotated method the input parameter of type Session is retained and how in the method sendMessage this user session is used to send a message across the web socket.

package nl.amis.cinema.view.push;

import java.net.URI;

import javax.websocket.ClientEndpoint;
import javax.websocket.CloseReason;
import javax.websocket.ContainerProvider;
import javax.websocket.OnClose;
import javax.websocket.OnMessage;
import javax.websocket.OnOpen;
import javax.websocket.Session;
import javax.websocket.WebSocketContainer;

// based on http://stackoverflow.com/questions/26452903/javax-websocket-client-simple-example

@ClientEndpoint
public class MovieEventSocketClient {
public MovieEventSocketClient(URI endpointURI) {
try {
WebSocketContainer container = ContainerProvider.getWebSocketContainer();
container.connectToServer(this, endpointURI);
} catch (Exception e) {
throw new RuntimeException(e);
}
}

Session userSession = null;

@OnOpen
public void onOpen(Session userSession) {
System.out.println("client: opening websocket ");
this.userSession = userSession;
}

/**
* Callback hook for Connection close events.
*
* @param userSession the userSession which is getting closed.
* @param reason the reason for connection close
*/
@OnClose
public void onClose(Session userSession, CloseReason reason) {
System.out.println("client: closing websocket");
this.userSession = null;
}

/**
* Callback hook for Message Events. This method will be invoked when a client sends a message.
*
* @param message The text message
*/
@OnMessage
public void onMessage(String message) {
System.out.println("client: received message "+message);
}

public void sendMessage(String message) {
this.userSession.getAsyncRemote().sendText(message);
}

}

Next we extend Class MovieEvent – the REST service that receives the JSON messages as HTTP POST requests – to interact with MovieEventSocketClient to pass the JSON messages to the Web Socket.

Method initializeWebSocket is added to instantiate MovieEventSocketClient with the address of the web socket. In postMovieEvent – the method annotated with @POST that handles the HTTP POST requests – a call is added to sendMessageOverSocket that hands the JSON message to MovieEventSocketClient  (after initializing it) to send it across the web socket (where it will be received in class CinemaEventSocketMediator).

 

package nl.amis.cinema.view;

import java.net.URI;
import java.net.URISyntaxException;

import javax.ws.rs.Consumes;
import javax.ws.rs.GET;
import javax.ws.rs.POST;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;
import javax.ws.rs.core.Context;
import javax.ws.rs.core.Request;

import nl.amis.cinema.view.push.MovieEventSocketClient;

//invoke at : http://localhost:7101/CinemaMonitor/resources/cinemaevent

@Path("cinemaevent")
public class MovieEvent {

private MovieEventSocketClient client;

private final String webSocketAddress = "ws://localhost:7101/CinemaMonitor/cinemaSocket";

public MovieEvent() {
}

private void initializeWebSocket() throws URISyntaxException {
//ws://localhost:7101/CinemaMonitor/cinemaSocket/
System.out.println("REST service: open websocket client at " + webSocketAddress);
client = new MovieEventSocketClient(new URI(webSocketAddress + "/0"));
}

private void sendMessageOverSocket(String message) {
if (client == null) {
try {
initializeWebSocket();
} catch (URISyntaxException e) {
e.printStackTrace();
}
}
client.sendMessage(message);

}

@POST
@Consumes("application/json")
@Produces("text/plain")
public String postMovieEvent(@Context Request request, String json) {
System.out.println("received event:" + json);
sendMessageOverSocket(json);
return "event received " + json;
}

@GET
@Produces("text/plain")
public String getMovieEvent(@Context Request request) {
return "nothing to report from getMovieEvent";
}
}

At this point, the application can be deployed. JSON messages sent to the REST service exposed by MovieEvent should be sent onward to the Web Socket (end point), leading to a message being written to the system output from onMessage in class CinemaEventSocketMediator.

Implement the HTML and JavaScript web application for the Cinema Monitor

The final piece in the puzzle discussed in this article is the client application to present the live status for the movie theater based on the web socket messages. It runs in a relatively modern browser – all standard browsers have HTML5 support which includes Web Socket interactions – and consists of an HTML page and two JavaScript libraries.

image

 

The HTML itself is relatively straightforward and boring.

image

Important are the <script> statements that import the JavaScript libraries that interact with the web socket and handle messages received over the web socket. The four rooms in the movie theater that are being monitored are represented by four TD elements with their id values set to room1..room4. These id values will be used in the JavaScript to locate the HTML element to update when a JSON message is received on the web socket for a particular room.
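The original HTML page is only shown as a screenshot above, so the sketch below gives an impression of its structure. The element ids (room1..room4, output, message) and the two script includes follow the JavaScript discussed next; everything else (title, table layout) is illustrative:

<!DOCTYPE html>
<html>
<head>
  <title>Cinema Monitor</title>
</head>
<body>
  <h1>Cinema Monitor</h1>
  <table>
    <tr><th>Room 1</th><th>Room 2</th><th>Room 3</th><th>Room 4</th></tr>
    <!-- the td ids are used by moviemonitor.js to show the occupation per room -->
    <tr><td id="room1">0</td><td id="room2">0</td><td id="room3">0</td><td id="room4">0</td></tr>
  </table>
  <div id="message"></div>
  <div id="output"></div>
  <!-- moviemonitor.js defines the update functions; websocket.js opens the web socket connection -->
  <script src="moviemonitor.js"></script>
  <script src="websocket.js"></script>
</body>
</html>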

The imported JavaScript library websocket.js initializes the web socket connection to the end point ws://localhost:7101/CinemaMonitor/cinemaSocket. It configures JavaScript handlers for onOpen, onClose and onMessage. The latter is the most important one: any messages received on the web socket are checked for the string room. If the string is found, the message is handed off to the function updateRoomDetails(). This function is defined in the second JavaScript library moviemonitor.js.

var wsUri = "ws://" + document.location.host + "/CinemaMonitor/cinemaSocket/5";
var websocket = new WebSocket(wsUri);

websocket.onmessage = function(evt) { onMessage(evt) };
websocket.onerror = function(evt) { onError(evt) };
websocket.onopen = function(evt) { onOpen(evt) };

function onMessage(evt) {
console.log("received over websockets: " + evt.data);
console.log("looked for room index of: "+ evt.data.indexOf("room"));
var index = evt.data.indexOf("room");
writeToScreen(evt.data);
if (index > 1) {
console.log("found room index of: "+ evt.data.indexOf("room"));
updateRoomDetails( evt.data);
}
}

function onError(evt) {
writeToScreen('<span style="color: red;">ERROR:</span> ' + evt.data);
}

function onOpen() {
writeToScreen("Connected to " + wsUri);
}

// For testing purposes
var output = document.getElementById("output");

function writeToScreen(message) {
if (output==null)
{output = document.getElementById("output");}
//output.innerHTML += message + "<br>";
output.innerHTML = message + "<br>";
}

function sendText(json) {
console.log("sending text: " + json);
websocket.send(json);
}

The function updateRoomDetails in moviemonitor.js does not do a whole lot. It parses the input parameter from plain text in JSON format into a JavaScript memory structure. The function handleRoomUpdate is invoked with that JavaScript data structure – an object with properties room and occupation. The function handleRoomUpdate locates the TD element with its id set to room#, where # corresponds with the room property in the roomDetails input argument. It then sets the innerHTML of this element to the value of the occupation property. The result is an instant update of the room occupation value displayed in the user interface.

function updateRoomDetails( json) {
var roomDetails = JSON.parse(json);
handleRoomUpdate(roomDetails);
}

function handleRoomUpdate( roomDetails) {
var roomId = roomDetails.room;
var occupation = roomDetails.occupation;

var roomCell = document.getElementById("room"+roomId);
roomCell.innerHTML = occupation;

document.getElementById("message").innerHTML = roomDetails;
}

 

The screenshot shows the situation after a number of JSON messages have been received over the web socket and the user interface has been updated accordingly.

 

image

 

Resources

Download the source code for the example discussed in this article: Zip File.

The post Java Web Application sending JSON messages through WebSocket to HTML5 browser application for real time push appeared first on AMIS Oracle and Java Blog.

]]>
https://technology.amis.nl/2015/05/14/java-web-application-sending-json-messages-through-websocket-to-html5-browser-application-for-real-time-push/feed/ 0
Publish a REST service from PL/SQL to handle HTTP POST requests – using the embedded PL/SQL gateway https://technology.amis.nl/2015/05/13/publish-a-rest-service-from-plsql-to-handle-http-post-requests-using-the-embedded-plsql-gateway/ https://technology.amis.nl/2015/05/13/publish-a-rest-service-from-plsql-to-handle-http-post-requests-using-the-embedded-plsql-gateway/#comments Wed, 13 May 2015 04:35:00 +0000 https://technology.amis.nl/?p=35917 Oracle Database can act as an HTTP server – using the Embedded PL/SQL Gateway (the 10g successor of the MOD_PLSQL gateway). With just a few statements, we can have the Oracle Database become a listener to HTTP requests (GET or POST). When requests are received at the configured host, port and URL, the request is [...]

The post Publish a REST service from PL/SQL to handle HTTP POST requests – using the embedded PL/SQL gateway appeared first on AMIS Oracle and Java Blog.

]]>
Oracle Database can act as an HTTP server – using the Embedded PL/SQL Gateway (the 10g successor of the MOD_PLSQL gateway). With just a few statements, we can have the Oracle Database become a listener to HTTP requests (GET or POST). When requests are received at the configured host, port and URL, the request is passed to a PL/SQL procedure that handles it and prepares a response.

In this article, we will expose a REST service at URL http://localhost:8080/api/movieevents. This service processes an HTTP POST request that in this case contains a JSON payload. The payload is passed to the PL/SQL procedure to do with as it sees fit.

The implementation takes place in two steps. First, some preparations must be made by the DBA – to make it possible for a particular database schema to handle HTTP requests received on a certain URL. This includes opening up a certain host and port.

To start with, check the current HTTP port:

select dbms_xdb.gethttpport
from   dual;

and if you do not like it, set another one:

EXECUTE dbms_xdb.SETHTTPPORT(8080);

The following statements create the Access Control List that specifies that the database schema WC is allowed to connect to host 127.0.0.1 (aka localhost) on ports between 7000 and 9200:

begin
  dbms_network_acl_admin.create_acl (
    acl             => 'utlpkg.xml',
    description     => 'Normal Access',
    principal       => 'CONNECT',
    is_grant        => TRUE,
    privilege       => 'connect',
    start_date      => null,
    end_date        => null
  );
end;

begin
  dbms_network_acl_admin.add_privilege (
    acl         => 'utlpkg.xml',
    principal   => 'WC',
    is_grant    => TRUE,
    privilege   => 'connect',
    start_date  => null,
    end_date    => null);
  dbms_network_acl_admin.assign_acl (
    acl => 'utlpkg.xml',
    host => '127.0.0.1',
    lower_port => 7000,
    upper_port => 9200);
end;

Next, the DAD is created – linking the URL path segment /api/ to the WC database schema. This means that any HTTP request received at http://localhost:8080/api/XXX is passed to a PL/SQL procedure called XXX:

BEGIN
  DBMS_EPG.create_dad
  ( dad_name => 'restapi'
  , path => '/api/*'
  );
  DBMS_EPG.AUTHORIZE_DAD('restapi','WC');
end;

The next line instructs the Embedded PL/SQL Gateway to return a readable error page whenever a request is not processed correctly:

exec dbms_epg.set_dad_attribute('restapi', 'error-style', 'DebugStyle');

This line associates the database user WC with the restapi url.

EXEC DBMS_EPG.SET_DAD_ATTRIBUTE('restapi', 'database-username', 'WC');

The final aspect of the preparation involves allowing anonymous access – this means that no username and password are required for HTTP calls  handled by the Embedded PL/SQL Gateway. As per Tim Hall’s instructions:

to enable anonymous access to the XML DB repository, the following code creates the “<allow-repository-anonymous-access>” element if it is missing, or updates it if it is already present in the xdbconfig.xml file.

SET SERVEROUTPUT ON

DECLARE
  l_configxml XMLTYPE;
  l_value     VARCHAR2(5) := 'true'; -- (true/false)
BEGIN
  l_configxml := DBMS_XDB.cfg_get();

  IF l_configxml.existsNode('/xdbconfig/sysconfig/protocolconfig/httpconfig/allow-repository-anonymous-access') = 0 THEN
    -- Add missing element.
    SELECT insertChildXML
           (
             l_configxml,
             '/xdbconfig/sysconfig/protocolconfig/httpconfig',
             'allow-repository-anonymous-access',
             XMLType('<allow-repository-anonymous-access xmlns="http://xmlns.oracle.com/xdb/xdbconfig.xsd">' ||
                      l_value ||
                     '</allow-repository-anonymous-access>'),
             'xmlns="http://xmlns.oracle.com/xdb/xdbconfig.xsd"'
           )
    INTO   l_configxml
    FROM   dual;

    DBMS_OUTPUT.put_line('Element inserted.');
  ELSE
    -- Update existing element.
    SELECT updateXML
           (
             DBMS_XDB.cfg_get(),
             '/xdbconfig/sysconfig/protocolconfig/httpconfig/allow-repository-anonymous-access/text()',
             l_value,
             'xmlns="http://xmlns.oracle.com/xdb/xdbconfig.xsd"'
           )
    INTO   l_configxml
    FROM   dual;

    DBMS_OUTPUT.put_line('Element updated.');
  END IF;

  DBMS_XDB.cfg_update(l_configxml);
  DBMS_XDB.cfg_refresh;
END;

The database account anonymous also has to be unlocked to truly enable anonymous access:

ALTER USER anonymous ACCOUNT UNLOCK;

 

This completes the preparations. We have now set up a DAD that is associated with the /api/* path in HTTP requests sent to http://localhost:8080/api/*. This DAD hands requests to the WC database schema to be handled. Requests do not have to include a username and password.

Now we have to connect to the WC database schema in order to create the PL/SQL procedure that will handle such requests.

create or replace procedure movieevents
( p_json_payload in varchar2 default '{}'
)
is
begin
  htp.p('call received p_json_payload='||p_json_payload);
  htp.p('REQUEST_METHOD='||owa_util.get_cgi_env(param_name => 'REQUEST_METHOD'));
end movieevents;

Between the definition of the DAD, the opening up of the port range and the creation of this procedure, we have completed the setup that will receive and process HTTP POST requests that send a body with any payload to http://localhost:8080/api/movieevents. This call will result in nothing but a simple response that describes in plain text what it received.
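A quick way to try this out – not covered in the article itself, so consider it a sketch – is a curl call that passes the payload as the p_json_payload form parameter the procedure declares:

curl -X POST --data-urlencode 'p_json_payload={"room":"1","partySize":3}' http://localhost:8080/api/movieevents

The response should echo the payload and report REQUEST_METHOD=POST.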

This opens up a bridge from any client capable of speaking HTTP to the database – non-transactional, across firewalls and without additional drivers.

Resources

Some resources:

 http://ora-00001.blogspot.com/2009/07/creating-rest-web-service-with-plsql.html

And especially Tim Hall:

http://www.oracle-base.com/articles/10g/dbms_epg_10gR2.php and  http://oracle-base.com/articles/misc/xml-over-http.php

The Oracle documentation: http://download.oracle.com/docs/cd/B28359_01/appdev.111/b28424/adfns_web.htm

On debugging and errorpage:  http://daust.blogspot.com/2008/04/troubleshooting-404-not-found-error-on.html

The post Publish a REST service from PL/SQL to handle HTTP POST requests – using the embedded PL/SQL gateway appeared first on AMIS Oracle and Java Blog.

]]>
https://technology.amis.nl/2015/05/13/publish-a-rest-service-from-plsql-to-handle-http-post-requests-using-the-embedded-plsql-gateway/feed/ 0
StreamExplorer – use REST adapter to feed results from event processing to a REST service https://technology.amis.nl/2015/05/13/streamexplorer-use-rest-adapter-to-feed-results-from-event-processing-to-a-rest-service/ https://technology.amis.nl/2015/05/13/streamexplorer-use-rest-adapter-to-feed-results-from-event-processing-to-a-rest-service/#comments Wed, 13 May 2015 04:25:23 +0000 https://technology.amis.nl/?p=35961 In a recent article, I described how StreamExplorer can be configured to consume events by exposing a REST service to which clients can send HTTP POST requests with JSON payloads. StreamExplorer also can make use of an Outbound REST Adapter through which results of explorations can be sent. This target service can be implemented in [...]

The post StreamExplorer – use REST adapter to feed results from event processing to a REST service appeared first on AMIS Oracle and Java Blog.

]]>
In a recent article, I described how StreamExplorer can be configured to consume events by exposing a REST service to which clients can send HTTP POST requests with JSON payloads. StreamExplorer also can make use of an Outbound REST Adapter through which results of explorations can be sent.

This target service can be implemented in a number of ways – that does not concern StreamExplorer. In this case we will use an implementation in a Java Servlet (JAX-RS). Alternatives are a SOA Suite service or even a Stream Explorer application with a REST based input stream. The REST service that is invoked in this example is published from a Java SE application – as described in this article. When you run the class CinemaEventHandlerRestStartup, a REST service is published at http://localhost:9998/movieevent, supporting both GET and POST operations.

REST services can be a target for StreamExplorer – so we should be able to quickly configure this service as a target. It will not do much with any messages it receives – it will just show in the console output that it did receive them.

The StreamExplorer application that will publish results to the REST target is introduced in this article. The example looks at a movie theater. More specifically, it monitors the visitors entering and leaving the various rooms in the theater so we keep track of the number of people currently present in each room. When rooms are almost full, we may have to stop selling tickets or perhaps open up an additional room showing the same movie. When a room is [almost] empty, perhaps we should cancel the show altogether. When people keep coming and going, there may be a problem we should attend to. In this case, the events are received by Stream Explorer in the form of JSON messages posted to a REST service.

The data shape for events in the stream is defined like this:

image

From the Catalog, create a new Exploration based on the stream CinemaComingAndGoing.

image

 

Run either the SoapUI test case or the PL/SQL script to have some cinema events published. Verify that these events lead to results in the new exploration.

image

Next, click on Configure a Target. Select REST as the target type

image

and set the resource URL to http://localhost:9998/movieevent .

image

Click on Finish. Next, publish the Exploration.

image

Publish some cinema events to the REST service exposed by the StreamExplorer exploration, as described in this article. The events are processed and forwarded to the REST service (exposed in this case by a Java Class running in a plain Java SE container).

 

image

The post StreamExplorer – use REST adapter to feed results from event processing to a REST service appeared first on AMIS Oracle and Java Blog.

]]>
https://technology.amis.nl/2015/05/13/streamexplorer-use-rest-adapter-to-feed-results-from-event-processing-to-a-rest-service/feed/ 0
Use the inbound REST adapter of StreamExplorer to consume events from HTTP POST requests https://technology.amis.nl/2015/05/12/use-the-inbound-rest-adapter-of-streamexplorer-to-consume-events-from-http-post-requests/ https://technology.amis.nl/2015/05/12/use-the-inbound-rest-adapter-of-streamexplorer-to-consume-events-from-http-post-requests/#comments Tue, 12 May 2015 19:06:13 +0000 https://technology.amis.nl/?p=35945 StreamExplorer is a fairly recent product from Oracle – a business user friendly layer around Oracle Event Processor. In various previous articles, I have discussed StreamExplorer. I have demonstrated how SX can consume events from JMS, EDN and from CSV files. This article shows how a stream in StreamExplorer can expose a REST service to [...]

The post Use the inbound REST adapter of StreamExplorer to consume events from HTTP POST requests appeared first on AMIS Oracle and Java Blog.

]]>
StreamExplorer is a fairly recent product from Oracle – a business user friendly layer around Oracle Event Processor. In various previous articles, I have discussed StreamExplorer. I have demonstrated how SX can consume events from JMS, EDN and from CSV files. This article shows how a stream in StreamExplorer can expose a REST service to which events can be pushed.

image

In this case, we will look at a movie theater. More specifically, we will monitor the visitors entering and leaving the various rooms in the theater so we keep track of the number of people currently present in each room. When rooms are almost full, we may have to stop selling tickets or perhaps open up an additional room showing the same movie. When a room is [almost] empty, perhaps we should cancel the show altogether. When people keep coming and going, there may be a problem we should attend to.

In this case, the events are received by Stream Explorer in the form of JSON messages posted to a REST service. We use a SoapUI test case with requests to send in these events. The test request, called partyOf3InRoom1, is shown in the figure:

image

The request is configured to be sent to the endpoint http://localhost:9002. That is the endpoint for Stream Explorer (and OEP and more specifically the Jetty server running the OEP domain on top of which Stream Explorer was applied). The (REST) Resource is specified as /cinema. Together this means that this request is sent as a POST request to the end point http://localhost:9002/cinema. So that is where our StreamExplorer application will have to consume the message.

The message itself has a JSON payload with two simple properties: room (to identify a room in the movie theater) and partySize (to stipulate the number of people involved in an observation). Note: the number of people is positive when a party enters the room and negative when it leaves the room.
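To make that concrete: the body of a request such as partyOf3InRoom1 would – formatting aside – amount to the JSON below, meaning a party of three people just entered room 1:

{"room":"1", "partySize":"3"}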

The TestSuite TestSuiteCinema contains a single test case with a number of steps that simulate events on a slow night in the movie theater.

Create a Stream and a First Exploration for Handling Cinema Events

Create a new Stream. The name is not crucial. The Source Type should be REST.

image

 

Specify the Context Path as /cinema.

image

Define the REST Shape. Set the name to CinemaEntryOrExitEvent. Define two properties: room – of type String – and partySize – of type Integer.

image

Click on Create to create the Stream.

The Exploration wizard opens next. Set a name for the exploration – for example RoomOccupancy. You may provide a description as well.

image

Click on Create.

Configure the Exploration for example like this:

image

The events are aggregated grouped by room. The aggregation to be calculated is a sum over the partySize. This should produce the total number of people in every room. In this case, I have also chosen to have the summary calculated once every 5 seconds and to include in the results only the events from the last 8 hours (which is quite arbitrary).

At this point I create a test case in SoapUI for the REST service – and run it:

image

The exploration starts reporting its findings:

image

 

Because publishing data to a REST service in JSON format is so easy – and is supported from many tools such as SoapUI and technologies including PL/SQL and Java – it is a very convenient way of sending events to a StreamExplorer application, both for development and testing purposes and for a production implementation.

Using the PL/SQL procedure described in this article, we can easily send events from inside the Oracle Database to the StreamExplorer stream:

image

– and verify the effects in Stream Explorer. Note that the movie showing in room 2 must be quite awful;-)

image

 

Resources

Getting Started with the REST Adapter in OEP 12c – A-Team Blog

Make HTTP Post call from PL/SQL

Make HTTP Post call from Java

The post Use the inbound REST adapter of StreamExplorer to consume events from HTTP POST requests appeared first on AMIS Oracle and Java Blog.

]]>
https://technology.amis.nl/2015/05/12/use-the-inbound-rest-adapter-of-streamexplorer-to-consume-events-from-http-post-requests/feed/ 0
Make HTTP POST request from Java SE – no frills, no libraries, just plain Java https://technology.amis.nl/2015/05/12/make-http-post-request-from-java-se-no-frills-no-libraries-just-plain-java/ https://technology.amis.nl/2015/05/12/make-http-post-request-from-java-se-no-frills-no-libraries-just-plain-java/#comments Tue, 12 May 2015 15:21:13 +0000 https://technology.amis.nl/?p=35920 This article shows a very simple, straightforward example of making an HTTP POST call to a url (http://localhost:8080/movieevents) and sending a JSON payload to that URL. The REST service invoked in this example is the service published from Java EE as described in this article. package nl.amis.cinema.view; import java.io.BufferedReader; import java.io.IOException; import java.io.InputStreamReader; import java.io.OutputStream; [...]

The post Make HTTP POST request from Java SE – no frills, no libraries, just plain Java appeared first on AMIS Oracle and Java Blog.

]]>
This article shows a very simple, straightforward example of making an HTTP POST call to a url (http://localhost:8080/movieevents) and sending a JSON payload to that URL.

The REST service invoked in this example is the service published from Java EE as described in this article.

package nl.amis.cinema.view;

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

import java.io.OutputStream;
 
import java.net.HttpURLConnection;
import java.net.MalformedURLException;
import java.net.URL;
 
 
 
public class CinemaEventGenerator  {
    /*
     * based on https://technology.amis.nl/2015/05/12/make-http-post-request-from-java-se-no-frills-no-libraries-just-plain-java/
     */
 
    private static final String USER_AGENT = "Mozilla/5.0";
    private static final String targeturl = "http://localhost:7101/CinemaMonitor/resources/cinemaevent";
 

public static void sendJson(String json) throws MalformedURLException, IOException {
        URL myurl = new URL(targeturl);
        HttpURLConnection con = (HttpURLConnection) myurl.openConnection();
        con.setDoOutput(true);
        con.setDoInput(true);
        con.setRequestMethod("POST");
        con.setRequestProperty("Content-Type", "application/json");
        con.setRequestProperty("Accept", "application/json,text/plain");

        // write the JSON payload to the request body
        OutputStream os = con.getOutputStream();
        os.write(json.getBytes("UTF-8"));
        os.close();


        StringBuilder sb = new StringBuilder();  
        int HttpResult =con.getResponseCode();
        if(HttpResult ==HttpURLConnection.HTTP_OK){
        BufferedReader br = new BufferedReader(new   InputStreamReader(con.getInputStream(),"utf-8"));  

            String line = null;
            while ((line = br.readLine()) != null) {  
            sb.append(line + "\n");  
            }
             br.close(); 
             System.out.println(""+sb.toString());  

        }else{
            System.out.println(con.getResponseCode());
            System.out.println(con.getResponseMessage());  
        }  

    }


public static void main(String[] args) {
        try {
            CinemaEventGenerator.sendJson("{\"room\":\"4\" , \"occupation\":\"5\"}");
        } catch (MalformedURLException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}

In this second example, the service being called is exposed by a PL/SQL procedure as described in this recent article. This example does not use any additional libraries – just the standard Java SE libraries.

package nl.amis.rest;

import java.io.BufferedReader;
import java.io.DataOutputStream;
import java.io.IOException;
import java.io.InputStreamReader;

import java.io.Reader;

import java.io.UnsupportedEncodingException;

import java.net.HttpURLConnection;
import java.net.URL;

import java.net.URLEncoder;

import java.util.LinkedHashMap;
import java.util.Map;


public class CinemaEventRouter {
    /*
     * Thanks to: http://stackoverflow.com/questions/4205980/java-sending-http-parameters-via-post-method-easily?rq=1
     */

    private static final String USER_AGENT = "Mozilla/5.0";
    private static final String targeturl = "http://localhost:8080/api/movieevents";

    public static void sendPost(String json)  {
        try {
        Map<String, Object> params = new LinkedHashMap<>();
        params.put("p_json_payload", json);

        StringBuilder postData = new StringBuilder();
        for (Map.Entry<String, Object> param : params.entrySet()) {
            if (postData.length() != 0)
                postData.append('&');
            postData.append(URLEncoder.encode(param.getKey(), "UTF-8"));
            postData.append('=');
            postData.append(URLEncoder.encode(String.valueOf(param.getValue()), "UTF-8"));
        }
        byte[] postDataBytes = postData.toString().getBytes("UTF-8");


        URL url = new URL(targeturl);

        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");
        conn.setRequestProperty("Content-Length", String.valueOf(postDataBytes.length));
        conn.setDoOutput(true);
        conn.getOutputStream().write(postDataBytes);

        Reader in = new BufferedReader(new InputStreamReader(conn.getInputStream(), "UTF-8"));
        for (int c; (c = in.read()) >= 0; System.out.print((char) c))
            ;
        }
        catch (Exception e) {
            System.out.println("Call to "+targeturl+" failed.");
            e.printStackTrace();
        }

    }
}

The post Make HTTP POST request from Java SE – no frills, no libraries, just plain Java appeared first on AMIS Oracle and Java Blog.

]]>
https://technology.amis.nl/2015/05/12/make-http-post-request-from-java-se-no-frills-no-libraries-just-plain-java/feed/ 0
Invoke a REST service from PL/SQL – make an HTTP POST request using UTL_HTTP in Oracle Database 11g XE https://technology.amis.nl/2015/05/11/invoke-a-rest-service-from-plsql-make-an-http-post-request-using-utl_http-in-oracle-database-11g-xe/ https://technology.amis.nl/2015/05/11/invoke-a-rest-service-from-plsql-make-an-http-post-request-using-utl_http-in-oracle-database-11g-xe/#comments Mon, 11 May 2015 18:07:43 +0000 https://technology.amis.nl/?p=35912 This article is small and simple. It discusses how from PL/SQL an HTTP POST request can be made to a REST service. This particular service is exposed at http://localhost:9002/cinema and it expects a POST call. Making HTTP requests from PL/SQL is fairly simple, using the supplied package UTL_HTTP. Starting in Oracle Database 11g, some security [...]

The post Invoke a REST service from PL/SQL – make an HTTP POST request using UTL_HTTP in Oracle Database 11g XE appeared first on AMIS Oracle and Java Blog.

]]>
This article is short and simple. It discusses how an HTTP POST request can be made from PL/SQL to a REST service. This particular service is exposed at http://localhost:9002/cinema and expects a POST call.

Making HTTP requests from PL/SQL is fairly simple, using the supplied package UTL_HTTP. Starting in Oracle Database 11g, some security constraints are in force around network interactions. This means that before a PL/SQL unit owned by a specific database account can make an HTTP call to a host and port, that account needs to be explicitly granted the privilege to do so. In this case, the user is called WC.

As DBA (for example as user SYS) we have to execute the following statements to clear the way for the WC account:

grant execute on utl_http to wc;
grant execute on dbms_lock to wc;

BEGIN
  DBMS_NETWORK_ACL_ADMIN.create_acl (
    acl          => 'local_sx_acl_file.xml', 
    description  => 'A test of the ACL functionality',
    principal    => 'WC',
    is_grant     => TRUE, 
    privilege    => 'connect',
    start_date   => SYSTIMESTAMP,
    end_date     => NULL);
end;

begin
  DBMS_NETWORK_ACL_ADMIN.assign_acl (
    acl         => 'local_sx_acl_file.xml',
    host        => 'localhost', 
    lower_port  => 9002,
    upper_port  => NULL);    
  commit;  -- commit the ACL changes
end; 

Through these statements, we create an Access Control List called local_sx_acl_file.xml that is associated with database user WC. The privilege that is assigned right away is connect. The host that the connect privilege applies to is configured in the last statement: localhost, which may be accessed on port 9002 (with upper_port left NULL, the port range defaults to just the lower_port).
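
To verify the result, the DBA can query the network ACL data dictionary views. A minimal check – a sketch using the standard DBA_NETWORK_ACLS and DBA_NETWORK_ACL_PRIVILEGES views of Oracle Database 11g – could look like this:

select acl, host, lower_port, upper_port
from   dba_network_acls;

select acl, principal, privilege, is_grant
from   dba_network_acl_privileges
where  principal = 'WC';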

These preparations by the DBA allow the WC account to start making the call-outs.

Connect as user WC and create the following PL/SQL procedure:

create or replace
procedure publish_cinema_event
( p_room_id in varchar2
, p_party_size in number
) is
  req utl_http.req;
  res utl_http.resp;
  url varchar2(4000) := 'http://localhost:9002/cinema';
  buffer varchar2(4000); 
  content varchar2(4000) := '{"room":"'||p_room_id||'", "partySize":"'||p_party_size||'"}';
begin
  -- start the POST request and set the request headers
  req := utl_http.begin_request(url, 'POST', 'HTTP/1.1');
  utl_http.set_header(req, 'user-agent', 'mozilla/4.0'); 
  utl_http.set_header(req, 'content-type', 'application/json'); 
  utl_http.set_header(req, 'Content-Length', length(content));
  -- write the JSON payload and execute the request
  utl_http.write_text(req, content);
  res := utl_http.get_response(req);
  -- process the response from the HTTP call
  begin
    loop
      utl_http.read_line(res, buffer);
      dbms_output.put_line(buffer);
    end loop;
    utl_http.end_response(res);
  exception
    when utl_http.end_of_body 
    then
      utl_http.end_response(res);
  end;
end publish_cinema_event;

This procedure publishes a JSON body in an HTTP POST request to the REST service. The payload is composed from the two input parameters passed into the procedure; for example, publish_cinema_event('1', 4) sends the payload {"room":"1", "partySize":"4"}.

Execute this anonymous PL/SQL block to have the procedure send three JSON payloads to the REST service.

begin 
  publish_cinema_event('2', -4);
  publish_cinema_event('1', 4);
  publish_cinema_event('3', -1);
end;
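
Note that the procedure writes the HTTP response with dbms_output. To actually see those lines in a SQL*Plus or SQL Developer session, enable server output first; a single test call could look like this:

set serveroutput on

begin
  publish_cinema_event('1', 4);
end;
/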

The post Invoke a REST service from PL/SQL – make an HTTP POST request using UTL_HTTP in Oracle Database 11g XE appeared first on AMIS Oracle and Java Blog.

]]>
https://technology.amis.nl/2015/05/11/invoke-a-rest-service-from-plsql-make-an-http-post-request-using-utl_http-in-oracle-database-11g-xe/feed/ 0
Keeping track of your licenses with OEM12C – reports https://technology.amis.nl/2015/05/11/keeping-track-of-your-licenses-with-oem12c-reports/ https://technology.amis.nl/2015/05/11/keeping-track-of-your-licenses-with-oem12c-reports/#comments Mon, 11 May 2015 12:00:04 +0000 https://technology.amis.nl/?p=35866 Wouldn’t be nice to get regularly informed how (in)compliant you are with Oracle licenses in an easy – centralized – way, and therefore not have to worry about visits of Oracle’s LMS – License Management Services? I think that would be nice for the most of us. Running LMS-scripts on the target databases, hosts and [...]

The post Keeping track of your licenses with OEM12C – reports appeared first on AMIS Oracle and Java Blog.

]]>
Wouldn’t it be nice to get regularly informed about how (in)compliant you are with Oracle licenses in an easy – centralized – way, and therefore not have to worry about visits from Oracle’s LMS – License Management Services? I think that would be nice for most of us. Running LMS scripts on the target databases, hosts and middleware is for now the most thorough way to get informed about possible non-compliance. Or, in some cases, using some clever – but informal and mostly incomplete – scripting on the OEM repository (an example of that kind of ad-hoc scripting is sketched below).
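
To give an idea of what such informal checking tends to look like, the sketch below queries DBA_FEATURE_USAGE_STATISTICS in a single database (repository-based variants typically query SYSMAN views such as MGMT$DB_FEATUREUSAGE, mentioned later in this post). Treat it as an illustration only, not as one of the official LMS scripts:

select name, version, detected_usages, currently_used, last_usage_date
from   dba_feature_usage_statistics
where  detected_usages > 0
order  by name;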

But… Oracle is making serious attempts to make this easier, by integrating the LMS information in the repository of Oracle Enterprise Manager and making it available through a couple of (BI Publisher) reports:

  • Database Usage Tracking Report
  • Database Usage Tracking Summary Report

When running these reports (Enterprise –> Reports –> BI Publisher Reports) with OEM 12.1.0.4 out of the box, unfortunately no data will be shown. Some manual configuration and a few upgrades are needed first. In the rest of this post I’ll explain the hurdles you have to overcome to get this working.

By the way, it’s not unthinkable that LMS will, on relatively short notice, accept the outcome of these reports as a valid source for determining (non-)compliance.

As said, this post is about the following reports:

[screenshot: the Database Usage Tracking reports in BI Publisher]

The Database Usage Tracking Summary Report can be run online; the Database Usage Tracking Report must be run through a scheduled job, and its output is sent to an FTP server.

The steps to get output from these reports:

  • The reports are BI Publisher reports. Although BI Publisher is the primary reporting tool for OEM12c and ships with it as standard, it still has to be configured before it works properly.

The script ‘configureBIP’ has to be run, as explained here. Be aware: Oracle Enterprise Manager will be restarted.

Note: BI Publisher may be used with OEM12c under a so-called ‘restricted license’: in short, free of charge as long as you use BI Publisher only for reporting against the Enterprise Manager repository.

  • Database usage tracking credentials have to be set; just follow the instructions in the documentation. Note: this has to be done per database target, and the required authorization is kind of bizarre:

[screenshot]

Finding out why – and whether – this is needed will be a next target for me.

  • Enabling the metric collection through monitoring templates, according to the same documentation mentioned before.
  • Configuring an FTP server via the same documentation (part: ‘generating database usage tracking report’); this is only needed for the Database Usage Tracking Report. For the purpose of this post, I used the same host where OEM has been installed.

So far the technical documentation. It should work now, shouldn’t it? Almost. A few additional requirements have to be met:

  • The repository view SYSMAN.MGMT$DB_FEATUREUSAGE is populated when the metric ‘Feature Usage’ has been enabled. This is well explained in note 1970236.1 (a quick way to check this is shown after this list).
  • The following tables should be populated too when enabling this metric, but weren’t at 12.1.0.4.0:

[screenshot]

  • Installing patch PSU3 was needed to get this right. Be aware: OPatch also needs to be upgraded. I used 11.1.0.12.6, to be found on the OPatch download page. Don’t use the 12.x OPatch; you will get something like ‘cannot run as root’…
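
As mentioned in the first bullet above, a quick – and admittedly informal – way to verify that feature usage data is actually arriving in the repository is simply to count the rows in that view (assuming you can query the SYSMAN schema):

select count(*) from sysman.mgmt$db_featureusage;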

After that, the output of the Database Usage Tracking Summary Report looks like this (only the repository database is shown here, as this is my lab environment):

[screenshot: database_usage_tracking_summary1]

 

Then, the Database Usage Tracking Report has to be scheduled through BI Publisher’s ‘Schedule Report Job’:

[screenshot: databaes_usage_scheduling1]

The output of the Database Usage Tracking Report looks like this, and closely resembles the output (Excel sheets) we had to send to LMS when executing the LMS scripts on the target database:

[screenshot: database_usage_tracking1]

OK, it works. But now what? Is this the right information, and what is the non-compliance in relation to what you’re entitled to use (NUPs, processor-based licenses, etc.)?

Regarding the first question: is this the right information?

I configured and ran this report on a virtual box, and the only database target is the repository database itself. Based on this information the output of the ‘summary’ report is not correct, and probably the detail report isn’t either. But that conclusion is not entirely fair; the question can only really be answered by comparing customer LMS data against the output of these reports. I think Oracle is serious about this and will undoubtedly get it right as soon as possible (it may need one or two more patches, I think).

The next step for me will be to ask a few customers whether they are willing to configure their Enterprise Manager the right way and, in parallel, run the LMS scripts on their target databases, in order to compare the outputs. I feel a new blog post coming up…

Regarding the second question, ‘what is the non-compliance’:

Even when the information is correct, you will always need somebody to interpret the data in relation to its use (production, development, etc.) and to put it next to what you’re entitled to under your Oracle agreements. So it’s not a script that unquestionably shows how non-compliant you are. But it’s definitely a step forward compared with running the LMS scripts on every target.

Regards…

 

Sources:

Configuring BIP: http://docs.oracle.com/cd/E24628_01/install.121/e24089/install_em_bip.htm

Configuring Usage reports: http://docs.oracle.com/cd/E24628_01/doc.121/e24473/usage_reports.htm#EMADM13567

OPatch download page: https://updates.oracle.com/download/6880880.html

The post Keeping track of your licenses with OEM12C – reports appeared first on AMIS Oracle and Java Blog.

]]>
https://technology.amis.nl/2015/05/11/keeping-track-of-your-licenses-with-oem12c-reports/feed/ 1