Smart data streaming adds real-time streaming analytics to the SAP HANA platform, making it easy for developers to incorporate smart stream capture, active event monitoring, alerting, and event-driven response into their SAP HANA applications.
Analyze and act on data as it arrives. Whether it's sensor data from smart devices, social media feeds, market prices, or click streams from a web site, SAP HANA smart data streaming can extract insight from that raw data, allowing you to respond as fast as things happen.
This option is available as of SAP HANA SPS 09.
The smart data streaming server runs as a separate server in the SAP HANA landscape, separate from, but interacting with, the SAP HANA database.
Data flows into the SAP HANA smart data streaming server from external sources through built-in or custom adapters, which translate incoming messages into a format that is accepted by the SAP HANA smart data streaming server.
This figure shows a typical smart data streaming deployment. Continuous queries, developed and tested as projects using the SAP HANA smart data streaming plugin for SAP HANA Studio, are deployed to an SAP HANA smart data streaming server.
Output adapters translate rows processed by the server into message formats that are compatible with external destinations, such as SAP HANA, and send those messages downstream.
SAP HANA cockpit provides an operations console for configuring smart data streaming.
SAP RFC Input and Output Adapter
The SAP Remote Function Call (RFC) adapter is both an input and output adapter. The RFC Input adapter executes RFCs to import data from SAP systems into smart data streaming, while the RFC Output adapter exports data from smart data streaming into SAP systems.
Both adapters include an adapter configuration file and a mapping file. Use the adapter configuration file to set up the RFC Input and Output Transporter modules and the smart data streaming Publisher and Subscriber modules, as well as to establish a connection to smart data streaming. Use the mapping file to specify mapping definitions for RFCs: how smart data streaming stream columns map to RFC parameters, how SAP table columns map to smart data streaming stream columns, or how columns of the flattened data result set map to smart data streaming streams.
You can either manually create a mapping file or use the schema discovery functionality in smart data streaming in SAP HANA Studio to automatically create one. See Discovering Schema and Creating a Mapping File for the SAP RFC Adapter in the SAP HANA Smart Data Streaming: Developer Guide for detailed instructions on creating a mapping file in studio.
The RFC Input adapter:
· Executes selected RFCs at predefined intervals and publishes the retrieved data to streams and windows in smart data streaming
· Operates in three modes: generic RFC (includes RFC chaining capabilities), read table, and BW
The RFC Output adapter:
· Maps data from smart data streaming streams and windows to input parameters when RFCs are invoked
· Operates in generic RFC mode only
The RFC Output adapter also supports guaranteed delivery (GD). It can batch up smart data streaming data rows in the RFC parameter list and submit the data in a single RFC call. The <BatchSize> configuration parameter controls how many smart data streaming data rows the adapter batches before invoking the RFC.
Batching makes sense for an RFC that builds up table parameters from smart data streaming column data, because it cuts down the number of RFC calls involved. If data is mapped to non-table parameters, values are overwritten as each new smart data streaming data row is processed, so use batching only when data is mapped to rows of a table parameter.
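The batching behavior described above can be sketched conceptually. The names here are hypothetical stand-ins, not the adapter's real implementation: rows accumulate in a buffer until the batch size is reached, then a single call is made with the whole batch.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Conceptual sketch of <BatchSize>-style batching (hypothetical names):
// streaming rows accumulate until batchSize is reached, then one call
// (standing in for the RFC invocation) receives the whole batch.
public class RfcBatcher {
    private final int batchSize;
    private final List<String> buffer = new ArrayList<>();
    private final Consumer<List<String>> rfcCall; // stands in for the RFC call

    public RfcBatcher(int batchSize, Consumer<List<String>> rfcCall) {
        this.batchSize = batchSize;
        this.rfcCall = rfcCall;
    }

    // Called once per streaming row; flushes a full batch in a single call.
    public void onRow(String row) {
        buffer.add(row);
        if (buffer.size() >= batchSize) {
            rfcCall.accept(new ArrayList<>(buffer));
            buffer.clear();
        }
    }
}
```

With a batch size of 100, for example, one thousand streaming rows translate into ten calls instead of one thousand.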
The RFC Input and Output adapters report custom statistics. Enable the time-granularity option in the project configuration (ccr) file to get these statistics.
The RFC Input adapter reports these statistics:
· AdapterRunningTime
· TotalInputRowsNumber
· SuccessInputRowsNumber
· ErrorInputRowsNumber
· InputLatency
The RFC Output adapter reports these statistics:
· AdapterRunningTime
· TotalOutputRowsNumber
· SuccessOutputRowsNumber
· ErrorOutputRowsNumber
· OutputLatency
Smart Data Streaming Adapter Toolkit
Use the smart data streaming adapter toolkit to quickly build custom external adapters using Java. Adapters built using the toolkit consist of various component modules configured together to deliver data to and publish data from smart data streaming. Module types include transporters (for interacting with external transports), formatters (for converting data from one format to another), and smart data streaming connectors (for subscribing or publishing to smart data streaming).
The toolkit includes numerous transporters, formatters, and smart data streaming connectors that can be configured in various combinations by an administrator. You can also combine these out-of-the-box modules with custom modules created by a Java developer.
The adapter toolkit allows you to implement:
· An input adapter to act as a data source for the smart data streaming server
· An output adapter to act as a data destination and deliver data from streams in smart data streaming
· A dual-direction adapter to act as both a data source and data destination for smart data streaming
· Guaranteed delivery (GD) to minimize loss of data during transfer of input data
· Schema discovery to automatically discover schemas for your custom input and output adapters
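The module composition described above can be sketched as a simple pipeline. The interfaces here are hypothetical stand-ins, not the real toolkit classes: a transporter reads raw payloads from an external transport, a formatter converts each payload, and a connector publishes the result to smart data streaming.

```java
import java.util.List;
import java.util.function.Function;

// Conceptual sketch of the adapter toolkit's module pipeline (all names
// are hypothetical, not the real toolkit API): transporter -> formatter
// -> connector, wired together by configuration in the real product.
public class AdapterPipeline {
    interface Transporter { List<String> poll(); }    // reads from external transport
    interface Connector { void publish(String row); } // publishes into streaming

    // Drain one poll of the transporter through the formatter into the connector.
    public static void run(Transporter in, Function<String, String> formatter,
                           Connector out) {
        for (String payload : in.poll()) {
            out.publish(formatter.apply(payload));
        }
    }
}
```

An output adapter reverses the direction: the connector subscribes to a stream, the formatter serializes rows, and the transporter writes to the external destination.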
Features:
· Real-time low latency response: Continuous dataflow computation optimized for real-time response; typical latency between input (arrival of new information) and output is a few milliseconds.
· Highly scalable: Multi-threaded compute architecture scales up on larger machines, and scales out using multi-node clusters. Process a thousand events per second or millions of events per second.
· SQL-based event processing language: Simple and familiar; define stream processing models as continuous queries, using SQL-like statements to easily define event windows, apply filters, aggregate data, and join events to other streams or to SAP HANA tables.
· Custom operators and functions: A simple inline scripting language lets you define custom operators, event handlers, and functions.
· Design-time tools in SAP HANA Studio: The streaming plug-in for SAP HANA Studio and for Eclipse provides tools for building models and testing them. Both visual and text editors, plus a stream viewer, record/playback, event input, and other tools simplify model creation and deployment.
· REST and WebSocket interfaces: For publishing events, subscribing to output, and managing projects.
· Adapters for rapid integration: A range of adapters for common interfaces and data formats is included.
· Adapter toolkit and API: An adapter toolkit (Java) as well as an API (C/C++, Java, .NET) are included, making it easy to build custom adapters or add native connectivity to an application.
· Management via SAP HANA Cockpit: The streaming server(s), projects (data models), and adapters are all managed from SAP HANA Cockpit.
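The continuous-query model behind these features can be illustrated with a toy sliding-window aggregate. In the product this logic would be expressed declaratively in the SQL-like event processing language; this Java sketch only mirrors the semantics of re-computing an aggregate as each event arrives.

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Toy illustration of a continuous query: a sliding window over the last
// N events, yielding a fresh aggregate (here, the average) per event.
// Purely conceptual; the product expresses this declaratively, not in Java.
public class SlidingAverage {
    private final int windowSize;
    private final Deque<Double> window = new ArrayDeque<>();
    private double sum = 0.0;

    public SlidingAverage(int windowSize) { this.windowSize = windowSize; }

    // Each incoming event updates the window and returns the new aggregate,
    // so output latency is one event, not one batch.
    public double onEvent(double value) {
        window.addLast(value);
        sum += value;
        if (window.size() > windowSize) {
            sum -= window.removeFirst(); // oldest event falls out of the window
        }
        return sum / window.size();
    }
}
```

The key contrast with a database query is direction: data flows through a standing query, rather than a query running over stored data.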
BENEFITS:
Rich streaming analytics
Apply complex processing logic, including machine learning functions, to identify patterns, calculate aggregates, and detect problems. Combine real-time events with historical reference data to generate predictions and alerts that enable systems to act in real time.
Alerts and response
Analyze incoming messages, watching for patterns or trends. Evaluate new information in the context of other information. Detect situations and enable an immediate response.
Power live dashboards
Continuously compute key performance indicators, streaming updates, and alerts, in real-time, to live operational dashboards.
Smart Capture
Filter, aggregate, or transform incoming streams of messages, capturing useful information in the SAP HANA database. Highly scalable, it can absorb data at high rates.
Machine learning
Integrated machine learning capabilities in the HANA smart data streaming engine provide the ability to generate and use predictions in real time.
Internet of Things
Analyze and transform data streaming in from massive numbers of smart devices. Extract insight and respond as fast as things happen.