SCN: All Content - SAP Manufacturing Integration and Intelligence (SAP MII)

SAP MII Illuminator Document XSD


Questions regularly come up around the use of XSDs, especially when working with the MII transaction WSDL interface to capture or return XML values. By default, an XML-typed transaction property is shown in the generated WSDL as a String rather than as XML; this is because, technically, there is no difference between an unstructured XML document and a String. If you want to provide a structured XML type, you have the option to define an XSD in the transaction properties window and choose the "Element Name" that represents the structure of the expected XML document, as shown here:

TrxXMLPropertyConfig.png

Simply press the "Select XSD" button to open the configuration dialog below and choose your XSD.

TrxXSDSetup.png

 

From here you can choose whether or not to enforce validation of the XML in MII (if enforced, MII will throw an error when the XML doesn't match the XSD).

 

The XSD used above is shown below; it defines the standard MII Illuminator XML format (/Rowsets/Rowset/Columns/Column & /Rowsets/Rowset/Row):

<?xml version="1.0" encoding="utf-8" standalone="no"?>
<s:schema xmlns:s="http://www.w3.org/2001/XMLSchema">
    <!-- definition of complex elements -->
    <s:element name="Messages">
        <s:complexType>
            <s:sequence>
                <s:element maxOccurs="unbounded" minOccurs="0" name="Message" type="s:string" />
            </s:sequence>
        </s:complexType>
    </s:element>
    <s:element name="Column">
        <s:complexType>
            <s:attribute name="MinRange"     type="s:decimal" />
            <s:attribute name="MaxRange"     type="s:decimal" />
            <s:attribute name="SQLDataType"  type="s:integer" />
            <s:attribute name="Name"         type="s:string" />
            <s:attribute name="SourceColumn" type="s:string" />
            <s:attribute name="Description"  type="s:string" />
        </s:complexType>
    </s:element>
    <s:element name="Columns">
        <s:complexType>
            <s:sequence>
                <s:element maxOccurs="132" minOccurs="1" ref="Column" />
            </s:sequence>
        </s:complexType>
    </s:element>
    <s:element name="Row">
        <s:complexType>
            <s:sequence id="RowSequence">
                <s:any maxOccurs="unbounded" minOccurs="0" namespace="##local" processContents="lax" />
            </s:sequence>
        </s:complexType>
    </s:element>
    <s:element name="Rowset">
        <s:complexType>
            <s:sequence>
                <s:element maxOccurs="1" ref="Columns" />
                <s:element maxOccurs="unbounded" minOccurs="0" ref="Row" />
            </s:sequence>
        </s:complexType>
    </s:element>
    <s:element name="Rowsets">
        <s:complexType>
            <s:sequence>
                <s:element maxOccurs="1" minOccurs="0" name="FatalError" type="s:string" />
                <s:element maxOccurs="1" minOccurs="0" ref="Messages" />
                <s:element maxOccurs="unbounded" minOccurs="0" ref="Rowset" />
            </s:sequence>
            <s:attribute name="DateCreated" type="s:dateTime" />
            <s:attribute name="EndDate" type="s:dateTime" />
            <s:attribute name="StartDate" type="s:dateTime" />
            <s:attribute name="Version" type="s:string" />
        </s:complexType>
    </s:element>
</s:schema>
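
For reference, here is a minimal Illuminator document that should validate against this schema; the column definitions and row values below are made up purely for illustration:

<?xml version="1.0" encoding="UTF-8"?>
<Rowsets DateCreated="2014-10-01T12:00:00" StartDate="2014-10-01T00:00:00" EndDate="2014-10-01T12:00:00" Version="14.0">
    <Rowset>
        <Columns>
            <Column Name="Material" SourceColumn="Material" Description="Material Number" SQLDataType="1" MinRange="0" MaxRange="0" />
            <Column Name="Quantity" SourceColumn="Quantity" Description="Confirmed Quantity" SQLDataType="8" MinRange="0" MaxRange="100" />
        </Columns>
        <Row>
            <Material>MAT-001</Material>
            <Quantity>42</Quantity>
        </Row>
    </Rowset>
</Rowsets>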

From here, when you call the WSDL generator interface for the MII transaction engine (Web Service Interface - Content Development - SAP Library), the response now includes a "typed" element and, at the top, a reference to your XSD:

WSDLDefinition.png

 

Hope that this helps,
Sam


Issue with SAP_ME_PAPI Interface block


Hello Experts,

 

Greetings!!

 

I am facing an issue with the SAP_ME_PAPI_Interface block. This block is part of a custom action JAR file deployed on our MII server. The JAR file is used for ME-ERP integration.

 

Description of issue:

 

In the MII transaction, I click on Configure Object of the SAP_ME_PAPI_Interface block.
Then I enter the SAP ME site name and credentials and click on Search APIs. I am able to fetch the list of APIs in SAP ME.
After selecting an API, I click OK. When prompted to generate request/response documents, I click Yes.

 

Now, moving to Configure Links:
On expanding the request/response nodes, I should get the complete hierarchy of nodes. But here, I am getting Test nodes only.
I am attaching screenshots for your reference.

me_papi1.png

 

Other Details:

1. SAP MII 14.0 SP05 Patch7

2. SAP ME 6.1.4.0

 

Is there anything I am missing?

Please help me with this issue.

 

Regards,

Minakshi.

MDO Query data for i5Chart


Hi,

 

I want to display MDO data with an i5Chart. When I leave the filter expression of the MDO query empty, it works fine. But when I fill the filter expression of the MDO query with a start date and end date, the i5Chart no longer works, even though the MDO query itself returns the right data. In this case the warning 'time engine Format mismatch. Using template default.' appears in the LogViewer. Where is my mistake?

 

We are using SAP MII 14.0 SP4 Patch 2.

 

Regards,

 

Matthias

BAPI_PROCORDCONF_CREATE_TT will not commit


Hi,

 

I am having a strange issue.

When posting with BAPI_PROCORDCONF_CREATE_TT and setting the commit flag on the configuration screen, or setting the commit as part of the configuration link (Boolean true), everything executes and I get back a "confirmation saved" message, but the posting is not committed.

 

Running MII 14 SP2 against ECC 6 (not sure what support package level); MII is a new install, while ECC has existed for a while.

 

Any idea what is going on?

file runner action block


Hello everybody,

I'm working with SAP MII 14.

I deployed the File Runner custom action block to call an external command on my MII server.

I need to call a Java class with the command "java -jar param1 param2".

I tried calling it from the prompt on the MII server and everything works fine.

When I call it from the MII Workbench I get this error:

 

  

  • [ERROR] Invalid link expression in action [ExecuteACommand_0]: "-jar param1 param2"
  • [ERROR] [ExecuteACommand_0] Link ('ExecuteACommand_0.Arguments' [Assign] from ""-jar param1 param2"") execution threw an exception. Exception: [while trying to invoke the method com.sap.lhcommon.expressioneval.Expression.evaluate(com.sap.lhcommon.expressioneval.ActionObjects, com.sap.lhcommon.common.NamedSet) of a null object loaded from field com.sap.xmii.bls.executables.links.AssignLink.expr of an object loaded from local variable 'this']

 

If I use just one parameter in Arguments (for example -version) everything works fine, but when I send more than one argument I get this error.

 

Any suggestions?

Or is there another way to solve my problem?

 

thanks in advance

 

Alex

SAP MII & ESP Integration Guide (Technical)


Overview

I am often asked how the SAP Event Stream Processing (ESP) engine and the SAP Manufacturing Integration & Intelligence (MII) application can be used together to provide value for various customers and their different use-cases. I wrote a blog about this topic from a business perspective here (http://scn.sap.com/community/manufacturing/mii/blog/2014/09/02/streaming-data-in-the-sap-industrial-space-miipcohanaesp) that highlights how to leverage the integration between these products to drive maximum value and coverage of the various use-cases. As you probably know, there are many deployment options for both the SAP MII and ESP products, and more recently deployment on a central HANA database has become an option for both. Each product has something that it's really good at, and the integration below highlights the strengths and weaknesses of the products and shows how they complement each other in a joint landscape.

 

What is SAP Event Stream Processor (ESP)

In short, ESP is really good at inline data analytics: as the data arrives in the stream, its waveform (typically a sliding window) is instantly correlated with various other waveforms to identify patterns in the data. It can also stream this data, in tremendously large volumes, to HANA database tables for longer-term predictive and operational analytics. It has its origins in the financial sector for high-frequency trading and trading analytics, but we believe there is value in this engine for driving in-line process analytics and controls visibility to operations folks for live/real-time process improvements (see this blog: http://scn.sap.com/community/manufacturing/mii/blog/2014/09/02/streaming-data-in-the-sap-industrial-space-miipcohanaesp).

 

What is SAP Manufacturing Integration & Intelligence (MII)

The SAP MII product is very good at providing local operations KPI views with SAP ERP context around the data, and it is bundled with SAP Plant Connectivity (PCo), which interfaces directly with the various Operations Technology (OT) systems. Combining these gives you a very powerful, ERP-centric view of your live and historical operations data for driving live performance management and KPI data to improve processes in real time. This view of the data can be pushed to ERP, to drive live insight into what is happening at various plants and locations, or to local operator dashboards and screens, to further drive operations visibility and capture additional context around events from operators. This data can then be loaded into the local MES, the central ERP/HANA instances, or both, in a manner that is seamless to the systems and workers, ensuring consistency of data and information.

 

Architecture

The SAP MII, PCo, ESP, and HANA products all have various technical features that can be jointly leveraged to drive local and central OT data with ERP context, so that the data can be correlated in-line and also after the fact, once it's stored in the HANA database tables. Having context for the various event points also enables ad-hoc analysis of detected events to drive operations improvements across multiple systems simultaneously. The most common architectural scenario for integrating MII/PCo and ESP is to stream live data from PCo directly to ESP in high volumes and, in parallel to the OT data, to include lower-volume data (such as data from an operator or an MES/ERP system) in the stream. Then, using the context provided by MII/PCo to feed the "Enriched Data Stream" of ESP, you have the ability to correlate the data across streaming inputs as it happens. Conceptually, the interface looks like this:

MIIPCoESP_ConceptualArch.png

The live operations data needs to be correlated with live execution and enterprise information, not only to have the proper identification fields, but also to know the additional conditions the assets are operating in, so as to determine the proper course of action, if any, that needs to be raised as an event and pushed to MII (for reporting or process improvement recommendations) or to HANA (for predictive analytics and multi-asset/distributed-asset comparisons). This is all done agnostically of the underlying historian/SCADA/DCS/HMI vendor, because it's using the SAP MII and PCo products to achieve it. On a more technical level, the following diagram shows the details behind the various interfaces between the different SAP products and how they are used:

MIIPCoESP_TechnicalLandscapeWithProtocols.png

 

There are many ways to leverage the above interfaces to drive process improvements and visibility to live operations data, in-line with the acquisition or after the fact in a larger analytical HANA model spanning huge time frames and many assets. Again, this is all possible because of technology improvements to the SAP MII product for mapping ERP data to the various tag sensor data points using the Plant Information Catalog (PIC). It will continue to improve with the "Remote Administration & Configuration of PCo" topic, which is currently in the customer co-innovation process and is planned to support pushing meta-data from the PIC to the notification streams of the PCo engine, specifically addressing the BuCM requirement shown in the above diagram.

Something Important to Keep in Mind

When people ask me about this scenario, I am always very pleased to hear that they have given it a lot of thought and are eager to implement it to bolster their existing analytics capabilities. I am also confronted with a common question around the reliability of the data and how accurate the streaming engine is for loading data into HANA. We are aware of the fact that sometimes, in the SCADA, DCS, or tiered historian layers, data gets buffered, so that the "Current" values do not update until the buffered data moves through the network; only then, if you're hooked to a historian, will the most recent (Current) data value be updated, and that is what will appear in the stream. So this streaming feed behaves more like a video feed than a guaranteed delivery/replication mechanism for the historian data. There are mechanisms built into the ESP engine to detect the flat-lining of tag data, which in ESP terms is called "Data Aging". This is a common scenario in the financial sector as well, which is why ESP already handles detecting it. From here, ESP can send a "Data Retrieval" style message to MII, which can then synchronously query the historian event table for all of the events missed during the detected flat-lining or data-aging time period and load the HANA tables with the missing data. The scenario looks like the below, in case you are a more visual person:

MIIPCoESP_MissingDataRecovery.png

The above scenario is a very common one in the real world, especially when working with a large number of distributed assets. It's important to know that it only needs to be configured for your particular business rules, rather than implemented as a custom development project.

 

Configuration of SAP ESP

In order to set up and use the above scenario, a stream interface first has to be defined in the SAP ESP product to accept the high-volume data from SAP PCo, along with another, lower-volume MII interface so that MII can push MES/ERP/operator data into the streams. For now we will stick to the PCo-to-ESP interface, as the MII-to-ESP interface is covered here already (SAP MII & Sybase ESP Publish Actions | SCN). To define a stream in the SAP ESP environment, first open the ESP Studio (additional details and help are available here: SyBooks Online), connect/log in to your ESP server, and create a new project and workspace. Using the SAP Sybase ESP Authoring perspective you can create very complex and interactive streaming and analytical flows like this:

MIIPCoESP_ESPStreamDefinition_Complex.png

 

The above flow enables the ESP engine to sort out the various streaming data feeds across multiple PCo notification streams, correlate the enriched streams together, and then generate a severity and notification payload for MII as the data is streamed in from the tags (100 ms per data point per tag). The notifications are then fed to the MII Messaging Services interface (HTTP) and stored in the MII KPI Framework, so that event frequencies can be stored and reported on as they happen, giving operators insight into the process. This is the foundation for Short Interval Control (SIC). The Post to MII operation in the above ESP stream definition simply authenticates (using Basic Authentication) and posts an XML message to the MII Message Listener (Message Listeners - Message Services - SAP Library) using this URL:

http://nvpal771.pal.sap.corp:50000/XMII/Illuminator?service=WSMessageListener&mode=WSMessageListenerServer&Session=false…

 

and is invoked whenever an "event" with an assigned severity is detected by the ESP engine across the streams as the live data flows into the engine. Here are some example snippets showing how the PCo data is handled and managed:

/** Write PCo events to a file for replay purposes -- starts manually */
ATTACH OUTPUT ADAPTER oaPCoEvents TYPE toolkit_file_csv_output TO isPCo GROUP asgNoStart 
PROPERTIES dir = 'D:/SAP/ESP/workspace/pco1/data' ,
 file = 'isPcoSimulator' ,
 csvPrependStreamNameOpcode = TRUE ,
 csvHasHeader = TRUE ;
/* Simulator input stream to test input without running PCo */
/*
ATTACH INPUT ADAPTER isPCoSimulator TYPE toolkit_file_csv_input TO isPCo GROUP asgNoStart 
PROPERTIES csvExpectStreamNameOpcode = TRUE ,
 dir = 'C:/ESP5.1SP04/workspace/pco1/data' ,
 file = 'isPcoSimulatorSmall.csv' ,
 csvHasHeader = TRUE ;
*/
/** Send XML alert to MII MessageService using HTTP POST */
ATTACH OUTPUT ADAPTER POST2MII TYPE toolkit_http_output TO Alerts2MIIXML PROPERTIES 
 bodyCharset = 'UTF-8' ,
 retryNumber = 1 ,
 bodyColumn = 1 ,
 requestUrl = miiMessagePostUrl ;
ATTACH OUTPUT ADAPTER Tags2HANA TYPE hana_out TO isPCo PROPERTIES service = 'MIIServiceHANA' ,
 sourceSchema = 'RTES' ,
 table = 'TAGDATA' ,
 outputBase = FALSE ,
 dataWarehouseMode = 'INSERTONLY' ,
 timestampColumnName = 'HANATimestamp' ,
 maxReconnectAttempts = 5 ,
 maxQueueSize = 32768 ;
ATTACH OUTPUT ADAPTER HANA_Output1 TYPE hana_out TO Alerts2MII PROPERTIES service = 'MIIServiceHANA' ,
 sourceSchema = 'RTES' ,
 table = 'MIIALERTS' ,
 dataWarehouseMode = 'INSERTONLY' ,
 timestampColumnName = 'AlertTimestamp' ;
ADAPTER START GROUPS asgNoStart NOSTART;
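
As an aside, you can sanity-check the MII Message Listener endpoint independently of ESP by posting a test message from any HTTP client. Below is a minimal sketch in Node.js (18+, for the built-in fetch); the host, credentials, and alert payload here are placeholder assumptions for illustration, not the actual ESP output format:

// Post a hand-crafted XML "alert" to the MII Message Listener the same way the
// ESP HTTP output adapter does: an HTTP POST with Basic Authentication.
const miiMessagePostUrl =
  "http://<mii-host>:50000/XMII/Illuminator" +
  "?service=WSMessageListener&mode=WSMessageListenerServer&Session=false";

// Hypothetical payload -- shape it to match whatever your Message Listener
// configuration expects for the message Name and Id fields.
const payload = `<?xml version="1.0" encoding="UTF-8"?>
<Alert>
  <Name>ALL_HIGH_TAG_VALUES</Name>
  <Id>TEST-0001</Id>
  <Severity>HIGH</Severity>
</Alert>`;

const basicAuth = Buffer.from("<user>:<password>").toString("base64");

fetch(miiMessagePostUrl, {
  method: "POST",
  headers: {
    Authorization: "Basic " + basicAuth,
    "Content-Type": "text/xml; charset=UTF-8",
  },
  body: payload,
})
  .then((res) => res.text().then((body) => console.log(res.status, body)))
  .catch((err) => console.error("Post to MII failed:", err));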

Configuration of SAP Plant Connectivity (PCo)

Once a stream like the one above is completed, or at least the inputs to the stream are defined, you can define an ESP destination in the SAP Plant Connectivity (PCo) product (also outlined in the help documentation here: Sybase ESP Destination - SAP Plant Connectivity - SAP Library). This destination can then be used and re-used by any number of notifications defined per agent instance inside of PCo. Here is a screen capture of how to set this up and correlate the entries between the ESP and PCo destination screens:

MIIPCoESP_PCoDestConfiguration.png

 

Once you think you have it set up properly, press the "Test Connection" button; if everything is deployed and running on the ESP server, the results should look like this:

MIIPCoESP_PCoDestinationSetupAndTest.png

 

The next step is to configure which tag data will be pushed into the ESP destination stream and how frequently. To do this, set up a PCo agent against your tag source system, define your subscription item tag for the agent, and then create a notification definition. For the trigger logic I defaulted this to Always, so that any time the tag value changes the new value is sent to the ESP stream. Then, in the payload of the notification, which is defined on the Output tab, I defined some meta-data about the stream (later this will be managed by the Remote Administration & Configuration of PCo feature), as shown below, to enhance the context of the streaming data:

MIIPCoESP_OutputConfiguration.png

Finally, the tag data, reading timestamp, quality secondary data, and the context defined on the Output tab have to be mapped to the input of the ESP stream interface. This is done on the Destinations tab and in my example looks like this:

MIIPCoESP_DestinationConfiguration.png


Once the agent is started, data will begin to flow to the ESP stream (visible in the ESP Studio), and the streams will also send the raw data to the specified HANA DB tables and the detected events to the SAP MII KPI engine for driving operations reports. The next step is to verify that the streaming data is in fact making it into the right ESP stream. To do so, open the ESP Studio, connect to the project, open the stream viewer, and verify that new data is continuously appearing in the stream:

MIIPCoESP_ESPStudioStreamViewer.png

 

Once you have verified that the data is in fact coming into the stream, you can then verify that the ESP stream is sending the raw data to the HANA tables and the detected event data to the MII Messaging Services layer.

 

Configuring MII to Accept ESP Event Notifications

The SAP MII Messaging Services layer was used here for message monitoring purposes, but you could just as easily have called the MII Transaction Runner servlet (Transaction Calls Using URLs - Content Development - SAP Library) in place of the Messaging Services. The Messaging Services layer does, however, have a nice monitoring and management layer, which helps with demonstrating the flow of the live notification data from ESP. First you need to set up a message processing rule to handle the XML payload and define how the messages will be processed, but this is easy to do. From the MII menu -> Messaging Services -> Message Listener, we have already imported and configured the expected XML payload from the ESP engine so that MII can automatically identify the message type and ID:

MIIPCoESP_MessageListenerConfig.png

 

As you can see, we have mapped the message Name and ID fields based on the contents of the XML payload. The message name is what the Message Processing Rules use to classify how the message is processed, and the ID is used to guarantee uniqueness of the processing and to assign a unique identifier to the detected event from ESP. The associated Message Processing Rule was set up to handle all messages with the name "ALL_HIGH_TAG_VALUES" (meaning there are lots of upper-threshold violations that need to be handled with additional logic), and below it a generic rule processes all of the other messages in the same way into the MII KPI engine, to persist and report on their frequency over the specified KPI time interval.

MIIPCoESP_MessageProcessingRules.png

 

The transaction used to handle the incoming ESP messages is very simple; its only job is to take the ESP XML message and load it into the KPI object:

MIIPCoESP_HandleESPMessage.png

 

The KPI Object definition looks like this and is used to identify possible problems with a given process for enablement of short-interval control (SIC) and other Kaizen initiatives:

MIIPCoESP_KPIDefinition.png

 

Once the data is loaded into the KPI object it is possible to visualize this on a web page and slice and dice the event occurrences based on the dimensions of the KPI (interactive chart).  Here is an example of what this looks like:

SAP_ESP_KPI_Image.png

 

This provides a great way to single out areas of concern by looking at event counts by "severity and type" of the events compared to what is expected to be there.  This could also be split across operational data dimensions, such as a material code.

 

Conclusion

I hope that the above technical architectural reference diagrams and example scenario show how integrating the ESP/MII/PCo/HANA products can provide a lot of value to local and enterprise users, both for short-interval control and for long-term multi-asset analytics on health and operations performance.

If you have any questions or would like further details on any point raised in this Document please let me know via the Comments section below.

Thanks,
Sam

How to get the "Write file" error message?


Hi

I have a WriteFile action block that works fine, but when it receives an invalid file path it throws an exception and writes an error message to the server log. So I put a Catch action block in place to get the error message and handle it; the issue is that the error message is not set in the Catch action block.

 

I need the message that I found in the server log so I can send it to the caller:

"/usr/sap/MID/integracao/ETON/DEV/ETON_RECEBE/_ETON_FICTICIO_1/OPLIST.ASC (No such file or directory)"

 

 

Capturar.PNG

 

Capturar2.PNG

Download OEE for MII


Hi

 

Please, how can I download OEE for MII 14.0? I would like to get to know this tool.


SAP MII, HANA/ESP, and Lumira (Business)


Executive Summary

The ecosystem of products at SAP is continually evolving and expanding to cover new and exciting areas every day. What they all have in common is that they are based on HTML5 UI technology, which makes them extremely flexible and user-friendly for end-users. However, not all data exists in or originates from the enterprise; more and more data is being generated by the machines that make up the operations layer (especially now that automation sensors and controls have come down so far in price). The question and challenge for the business is where to invest in upgrading or replacing equipment to provide the largest ROI or to prevent the next big unplanned outage. The adoption of new methodologies like Industry 4.0 or Smart Manufacturing is founded on the linkage between the enterprise and operations layers. As this linkage matures away from non-integrated approaches of manually aggregating and reporting offline data toward fully automated, real-time, web/cloud-accessible data on demand, it will quickly identify the holes in your data. This realization of what you actually have versus what you need is what will drive re-investment in your operations technology, and in turn this will accelerate the efficiency of your overall business from the ground up.

 

I have also recently completed an architectural deployment options document on how and why various deployments of MII in a HANA/ESP/Lumira environment make sense and how MII fits into your overall organization; the link is here: SAP MII, HANA/ESP and Lumira (Technical)

For a more technical view of "how-to" directly leverage your MII Content inside of the Lumira product see my technical BLOG here: SAP Lumira and MII (How to)

 

Introduction to the Products

The realization of end-to-end flows of data, which not only drive workflows and business processes but also provide insight into priorities and live views of manually collected data, enables decisions to be made faster and with greater confidence and transparency across the organization. However, the data itself has to have linkage, with common terms and methodologies for getting at it in an intelligent, self-service way. This means it has to be available to business and power users so that the IT department can keep up with all of the various reporting demands.

 

What is SAP Lumira

SAP Lumira (Data Visualization | Business Intelligence & Analytics) is the primary tool from SAP for exploring big and small data in your enterprise. This self-service data visualization software makes it easy to combine data from multiple sources, visualize it, analyze trends, and share insights on the BI platform or in the cloud. Its primary use case is visualization of HANA data, but it can easily be extended to reach into other systems to retrieve data, and this includes the SAP MII product. There is already a vast ecosystem around this product, available here: http://scn.sap.com/community/Lumira

What is SAP MII

SAP Manufacturing Integration & Intelligence (MII: Manufacturing Integration and Intelligence) is the fastest and easiest way to link manufacturing processes with business operations, enabling collaborative manufacturing and giving you the visibility you need to run your business in real time. Its primary use cases are to service-enable the existing plant operations systems in a non-disruptive way, to report locally, and to automate the integration between the enterprise back-end and shop-floor environments. It has been around at SAP for almost a decade now and has wide-ranging usage across every industrial vertical and region that SAP covers. A simple diagram of what MII is and how it's commonly used is here:

SAPMIIDiagram_Simple.png

 

With some of the newer releases of the SAP MII product we have included additional features and functionality around UIs (http://scn.sap.com/docs/DOC-55991) and their development, but also around how the MII engine can expose data generically to other SAP products, like Lumira, and to third-party ISVs (like Microsoft: SAP Enterprise Manufacturing Intelligence (EMI) Solution for Microsoft Office 365). Exposing manufacturing operations data can often pose challenges unless there is a common platform for generically exposing these heterogeneous and vertical systems. As you found this document, you probably already know that there is an enormous community around the SAP MII product; the main landing page is here: http://scn.sap.com/community/manufacturing/mii

Use Case Scenario

When the Lumira product was first introduced at SAP, we went over various ways it could be integrated across an enterprise, and linking live manufacturing/operations data with an enterprise UI tool was one of our primary scenarios. This on-demand access to live manufacturing/operations data means that big data from HANA can quickly be combined with, or drilled into, the raw/live data at one or many manufacturing plant locations. This provides powerful insights into manufacturing performance. The full power of this tool has yet to be realized by many, but those who have started to use it have quickly recognized its potential. To take this one step further, what if you combined asset health data from the existing CBM RDS (http://scn.sap.com/community/manufacturing/mii/blog/2014/09/09/let-s-set-the-record-straight-on-the-cbm-rds) solution to drive data into HANA for asset health monitoring? Then you start to run our predictive modeling libraries inside SAP HANA (PAL, R, & Infinite Insight) on this vast set of operations data that can be continually streamed into the HANA database (http://scn.sap.com/community/manufacturing/mii/blog/2014/09/02/streaming-data-in-the-sap-industrial-space-miipcohanaesp). From this set of (big) data you can identify and color-code common failure points in various key assets and potentially visualize them in our 3D modeling product, Visual Enterprise. Alongside this, you can quickly identify assets (and their location in your business) that are likely to fail and require maintenance, and reach into the live manufacturing operations environment to visualize the current operational usage and conditions for that asset. Sounds pretty far out, right? Well, this may not be as far away as you think; it is actually quite a common scenario that we hear from our customers, so common that folks within SAP have already done it; see below:

Lumira_VE_MII.png

The above example combines all of the data and capabilities outlined so far to provide an integrated and holistic view across various systems and types of data for the end-user. An important concept to take from the scenario is that the components involved in this type of top-floor to shop-floor data analytics story are the same regardless of whether the topic is asset health, quality, or operations/production forecasting.

 

I hope that you enjoyed this blog; please let me know if you have further questions or comments.


Thanks again,

Sam

MII 14.0 performance


Hi

 

We are going to suggest the MII 14.0 version to our client. I just want to know the following about MII 14.0:

 

1) Performance and Stability

2) Any known bugs in 14.0

3) Any compatibility issues

4) When will the next version (15.0) be available?

 

Thanks in advance

Shaji

i5Grid getSelectedCellValue() gives incorrect values after filter or sort.


Hi Experts,

 

I have a few i5Grids with the selection event enabled, which works fine on the grid as is.

 

But if the user filters or sorts the rows and then selects a row, the selection event returns the values of the row that was originally in that position.

Eg:

In the original grid I have 40 rows, where the row I am looking for is in the 20th position and a random row is in the third position.

I filter/sort the rows so that the 20th row moves to the third position.

Now when I select the 20th row, which is now in the third position, the values returned by the selection event are those of the original third row before sorting.

But I expect to get the values of the 20th row, which has now moved to the third position.

 

Please help. I am using MII 14.0 SP05

 

Regards

Maaz

Getting started with Viz Charts


Dear All,

 

When someone joins the UI5 clan, one of the very first areas of interest is charts and dashboards. So here's a simple demonstration of how one can create charts for a dashboard.

 

Prerequisites:

1. A raw data set

2. Flattened data sets (one per chart)

 

Steps:

1. Create a raw data set like:

var oData1 = {TechKPI:[
  {date:new Date("2014-02-01T09:28:56"), TotalAvailability:87.3, WindAvailability:33.3, EngineAvailability:81.1, TotalEFOR:12.2, WindEFOR:5.2, EngineEFOR:15.9},
  {date:new Date("2014-02-02T09:28:56"), TotalAvailability:87.4, WindAvailability:34,   EngineAvailability:81.1, TotalEFOR:11.5, WindEFOR:1,   EngineEFOR:16.3},
  {date:new Date("2014-02-07T09:28:56"), TotalAvailability:87.8, WindAvailability:33.1, EngineAvailability:81.5, TotalEFOR:13.7, WindEFOR:8.7, EngineEFOR:15.7},
  {date:new Date("2014-02-15T09:28:56"), TotalAvailability:87.1, WindAvailability:33.1, EngineAvailability:81.4, TotalEFOR:13.2, WindEFOR:8.1, EngineEFOR:15.2},
  {date:new Date("2014-02-28T09:28:56"), TotalAvailability:87.1, WindAvailability:33.5, EngineAvailability:81.1, TotalEFOR:13.2, WindEFOR:8.1, EngineEFOR:16.4}
]};

 

2. Create a flattened data set: A viz chart needs a flattened data set to be portrayed on the HTML page. A flattened data set consists of dimensions and measures. Dimensions are mostly for the axes, and measures are for the value trends.

 

var oDataset = new sap.viz.ui5.data.FlattenedDataset({
  dimensions : [{
    axis : 1,
    name : "Date",
    value : {
      path : 'date',
      formatter : function(fval) {
        jQuery.sap.require("sap.ui.core.format.DateFormat");
        var oDateFormat = sap.ui.core.format.DateFormat.getDateTimeInstance({pattern : "dd-MM"});
        return oDateFormat.format(new Date(fval));
      }
    }
  }],
  measures : [{
    name : 'Total Availability',  // 'name' is used as label in the Legend
    value : '{TotalAvailability}' // 'value' defines the binding for the displayed value
  }],
  // 'data' is used to bind the whole data collection that is to be displayed in the chart
  data : {
    path : "/TechKPI"
  }
});

 

The formatter function is used to display the dates as labels in a particular format.

 

Note: A pie chart has only one measure; similarly, for other charts the number of measures depends on the chart type (see the sketch below).
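
For example, a minimal pie-chart sketch reusing the raw data set above could look like this (the choice of "Total Availability" as the single measure is arbitrary, purely for illustration):

var oPieDataset = new sap.viz.ui5.data.FlattenedDataset({
  dimensions : [{ axis : 1, name : "Date", value : "{date}" }], // one slice per date
  measures   : [{ name : "Total Availability", value : "{TotalAvailability}" }], // a pie takes exactly one measure
  data       : { path : "/TechKPI" }
});
var oPie = new sap.viz.ui5.Pie({
  width   : "400px",
  height  : "300px",
  dataset : oPieDataset
});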

 

3. Bind the flattened data set to a viz chart.

Libraries required: sap.viz, sap.ui.commons

 

var oLine = new sap.viz.ui5.Line({
  width  : "770px",
  height : "300px",
  selectData : function(oEvent) {
    // get the event data as provided by the native sap.viz library
    var oSelectData = oEvent.getParameter("data");
    // let the dataset convert the event coordinates back to a UI5 model context
    var oContext = this.getDataset().findContext(oSelectData[0].data[0].ctx.path);
    updateChart1();
  },
  title : {
    visible : true,
    text : 'Capacity Factor (CF %)'
  },
  dataset : oDataset // flattened data set being referred to
});

 

4. Clone the data set: Even if your raw data set is the same and you want to show, say, one line chart and one area chart, you cannot use the same flattened data set for both, even though its structure is identical; there is a one-to-one mapping between data set and chart. To handle such scenarios, you can use the clone() method to create a clone of an existing data set and bind it to the other chart.

Simply,

var oDataset1 = oDataset.clone();

 

Once you have the cloned data set, you can proceed as in step 3.
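
One detail the steps above leave implicit is how the raw data set from step 1 actually reaches the chart: the data path "/TechKPI" in the flattened data set is resolved against a model set on the chart. A minimal sketch (assuming an HTML element with id "content" exists on the page):

// Wrap the raw data in a JSON model so the dataset's "/TechKPI" path resolves,
// then render the chart into the page.
var oModel = new sap.ui.model.json.JSONModel(oData1);
oLine.setModel(oModel);
oLine.placeAt("content"); // "content" is a hypothetical DOM element id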

 

Points to note:

 

1. Dimensions and measures depend on the type of viz chart.

2. There is a one-to-one mapping between flattened data sets and viz charts; use clone() when dealing with the same data set.

 

Hope this helps!!

 

Warm Regards,

Swaroop

Average of Tags using SAP PCo Query


Hi All,

 

I am using a PCo query and I need to average the tag values for each hour of a day.

I have given the duration as 1 hour in the query template. How can I get the average for each hour?

Please let me know how to do it.

 

Regards

G.Partheeban

Repeater Action Next Row Read


Hello All,

 

 

I have a requirement involving the Repeater action block.

 

I have put a Repeater action on the XML returned by a SQL Query action block, and I am reading the data in each iteration.

But I also have to read the next row's data while it is reading the current row.

 

Suppose the data is as below. When the repeater is reading the 2nd row (DEF), I want to read the 3rd row (GHI) in the same iteration.

 

1  ABC
2  DEF
3  GHI
4  XYZ

 

 

Regards.

Anshul

Kiosk effect for irpt page


Hi,

 

I have an irpt page from which I want to open another link with a kiosk effect, i.e. full-screen mode (without the browser bars). Does anybody have an idea how to open the URL from MII in kiosk mode?

Or is there any alternative?

 

Thanks & Regards,

Eshwar..


Problem while inserting data with more than 4000 characters.


Hi All,

 

I am facing an issue while dealing with data having more than 4000 characters.

 

I am using SAP xMII 11.5.3 b66 and Oracle 10g. The JDBC driver is "ojdbc14 10.2.0.5.0". The datatype of the column in the target table is "CLOB" (so it should accept data with more than 4000 characters).

 

Below is the sample scenario I tried.

I have one table called "CLOBTAB" with a column "CLOBCOL" of datatype CLOB.

I created a query template for inserting data into this sample table: "insert into CLOBTAB values ('[Param.1]')".

I then tried to pass the big data from the front end through 'Param.1', but it throws the error "ERROR - java.sql.SQLException: ORA-01704: string literal too long".

 

Is there any workaround available for this issue?

 

I checked the following threads, but was not able to find a solution.

Oracle JDBC driver 10g and xMII: Re: Oracle JDBC driver 10g and xMII

CLOB Data Columns: CLOB Data Columns

 

Do we need to set "SetBigStringTryClob=true" to solve this? If yes, where do we set it?

 

Thanks in advance.

 

Regards,

Subin

i5PieChart issue - unable to display value column value


Hi,

 

I am trying to generate an i5PieChart with a query template which returns the data below. The query returns:

     quantity confirmed

     quantity pending

     Quantity in work.

 

 

I have mapped this column as the value column, but the pie chart is not getting generated; it just displays one complete circle.

However, when I look at the raw data on the chart, I can see the correct values.

 

Attaching the configuration and pie chart screenshots which I am getting.

 

We are using xMII 15.

 

 

I would appreciate quick help.

 

Regards

Vishal Jadhav

SAP MII and SAP ME


Hi Experts,

 

I am from a manufacturing background and have configuration knowledge of MM/WM/PP. I recently came across SAP MII/ME and I want to know whether they are technical modules or functional modules similar to SAP MM, since when studying online I see topics like HTML, Java, SAP NetWeaver, etc. Do I need to learn all of that coding to get myself into SAP MII/ME?

 

And is it worth buying the SAP MII book from SAP Press for someone like me?

 

Please advise me.

Connecting SAP PCo to Netweaver ABAP stack


Situation

In our project we have to communicate with a huge number of Programmable Logic Controllers (PLCs). The project is currently analyzing whether SAP Plant Connectivity can handle such a scenario, with approximately 1,000 PLCs (or more, up to 2,000) connected directly to a NetWeaver 7.31 ABAP stack.

 

In our scenario we have a custom PCo agent for a special communication protocol. Each PLC will be connected to its own PCo agent. Currently each PCo agent requires approximately 30 MB of RAM (the sizing document specifies 50-100 MB). Each PCo agent will create 2 permanent RFC connections to the NetWeaver system. Additional connections may be created by CCMS agents monitoring the PCo instances.

 

Looking at the target scenario with more than 1,000 PLCs, we raise the question of whether SAP PCo is the right middleware to connect the NetWeaver ABAP stack with PLCs in terms of performance, stability, and monitoring.

 

Questions

 

  • Is SAP Plant Connectivity designed to deal with such a large number of PLCs with respect to the direct PCo - ABAP-stack connection?
  • Is there any risk/limit regarding the large number of RFC connections to the ABAP stack?
  • Is there any recommendation for the system landscape regarding high availability and performance in such a scenario?

 

Looking forward to your answers and recommendations.

SOAP Call Executes TRX Twice


Hi Experts,

 

I have an SAP MII transaction that I seek to execute from .NET.  When executing it from the .NET application via SOAP, it either:  a) executes once if the transaction fails, or b) executes twice if the transaction succeeds.  What is going on?  How do I solve this?  Security seems to be the issue...

 

To replicate... 

 

I have a simple transaction with 2 input parameters and 1 output parameter. The transaction simply adds the 2 numbers together and returns the result. I can test using SoapUI to remove any .NET anomalies. I am using SAP MII 14.0 SP5 and the latest version of SoapUI.

 

Scenario 1:  Using the WSDL, I create the request:

 

<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:xmii="http://www.sap.com/xMII">
   <soapenv:Header/>
   <soapenv:Body>
      <xmii:XacuteRequest>
         <xmii:LoginName>MY SAP MII USER NAME</xmii:LoginName>
         <xmii:LoginPassword>MY SAP MII USER NAME</xmii:LoginPassword>
         <xmii:InputParams>
            <xmii:a>3</xmii:a>
            <xmii:b>2</xmii:b>
         </xmii:InputParams>
      </xmii:XacuteRequest>
   </soapenv:Body>
</soapenv:Envelope>

 

I execute it and get a response that includes an error and the dataset (below). Why do I get data AND an error?

 

HTTP/1.1 401 Unauthorized
server: SAP NetWeaver Application Server 7.20 / AS Java 7.31
...
<?xml version="1.0" encoding="UTF-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
   <soap:Body>
      <XacuteResponse xmlns="http://www.sap.com/xMII">
         <Rowset>
            <Row><c>5</c></Row>
         </Rowset>
      </XacuteResponse>
   </soap:Body>
</soap:Envelope>

 

Scenario 2: If I add the username, password, and domain to the request properties, then I get no error (HTTP/1.1 200 OK) and the dataset, but the transaction executed twice in SAP MII.

 

Is there a way to execute the web service as a "guest" (i.e. Scenario 1 without the 401 error)?  Or am I missing something completely here?

 

Thanks in advance,

Dave
