
How to store data with AWS IoT SiteWise Edge in many locations


Introduction

In this post, we discuss how AWS IoT SiteWise and AWS IoT SiteWise Edge can be used to store data not only in the AWS IoT SiteWise data store but also in many other locations. By default, data is stored in the AWS IoT SiteWise data store on AWS.

Customers told us that they want to use AWS IoT SiteWise to collect their industrial data from OPC-UA data sources. But not all customers want to store their data only in the AWS IoT SiteWise data store. In this blog post, we describe how to store data in other services like Amazon S3 or Amazon Timestream, or how to consume the data in a customer's on-premises environment.

AWS IoT SiteWise is a managed service that lets you collect, model, analyze, and visualize data from industrial equipment at scale. An AWS IoT SiteWise gateway collects data from industrial equipment and stores it in the AWS IoT SiteWise data store in the cloud.

AWS IoT SiteWise Edge brings features of AWS IoT SiteWise in the cloud to the customer's premises. You can process data locally in the AWS IoT SiteWise gateway and visualize equipment data using local AWS IoT SiteWise Monitor dashboards served from the AWS IoT SiteWise gateway.

By default, data is stored in the AWS IoT SiteWise data store on AWS.

In this blog post, we describe how customers can use the AWS IoT SiteWise Edge gateway to collect data but store it outside of the AWS IoT SiteWise data store.

Time to read             8 min
Learning level           300
Services used            AWS IoT SiteWise Edge, AWS IoT Greengrass, Amazon Kinesis Data Streams, Amazon Timestream

Solution

Deploying the AWS IoT SiteWise Edge gateway on AWS IoT Greengrass Version 2

I'm going to explain how the AWS IoT SiteWise Edge gateway is deployed on AWS IoT Greengrass Version 2.

The AWS IoT SiteWise Edge gateway runs in the form of components on AWS IoT Greengrass Version 2. The Data Collection Pack consists of two components, the SiteWiseEdgeCollectorOpcua and the SiteWiseEdgePublisher. The Data Processing Pack consists of a single component, the SiteWiseEdgeProcessor.

The Data Collection Pack collects your industrial data and routes it to AWS destinations. The Data Processing Pack enables the gateway to communicate with edge-configured asset models and assets. You can use edge configuration to control which asset data to compute and process locally. You can then send your data to AWS IoT SiteWise or other AWS services in the cloud.

The following screenshot shows an AWS IoT Greengrass V2 deployment with the Data Collection Pack and Data Processing Pack deployed.

Figure 1: AWS IoT Greengrass V2 deployment
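
If you prefer to script such a deployment instead of using the console, the following sketch shows how it could be created with boto3. The target ARN and the component versions are placeholder assumptions for illustration only; use the component versions available in your account and Region.

import boto3

# Sketch: create a Greengrass V2 deployment that contains the SiteWise Edge
# components. Target ARN and component versions are placeholders.
greengrass = boto3.client("greengrassv2")

response = greengrass.create_deployment(
    targetArn="arn:aws:iot:eu-central-1:123456789012:thinggroup/SiteWiseEdgeGroup",
    deploymentName="sitewise-edge-gateway",
    components={
        "aws.greengrass.StreamManager": {"componentVersion": "2.1.0"},
        "aws.iot.SiteWiseEdgeCollectorOpcua": {"componentVersion": "2.1.0"},
        "aws.iot.SiteWiseEdgePublisher": {"componentVersion": "2.1.0"},
        "aws.iot.SiteWiseEdgeProcessor": {"componentVersion": "2.1.0"},
    },
)
print(response["deploymentId"])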

Understanding the AWS IoT SiteWise gateway architecture

To send data to locations other than the AWS IoT SiteWise data store, you first need to understand the default architecture of the AWS IoT SiteWise gateway.

By default, data is ingested into the AWS IoT SiteWise data store as follows: the SiteWiseEdgeCollectorOpcua collects data from OPC-UA sources and ingests it into an AWS IoT Greengrass stream on the gateway, by default the SiteWise_Stream. The SiteWiseEdgePublisher reads the data from the stream and transfers it to the AWS IoT SiteWise data store on AWS.

Figure 2: AWS IoT SiteWise gateway architecture

Configuring destinations in the AWS IoT SiteWise gateway to store data in many locations

To send data to a destination other than the AWS IoT SiteWise data store, the gateway configuration lets you configure the AWS IoT Greengrass stream name where the SiteWiseEdgeCollectorOpcua stores the data. You define the stream name for each data source in your AWS IoT SiteWise gateway. You can use the AWS IoT SiteWise console, the AWS CLI, or an AWS SDK to configure the stream name.
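
As an illustration, the following sketch uses the AWS SDK for Python (boto3) to point a data source at a custom stream by updating the gateway's OPC-UA collector capability configuration. The gateway ID, the capability namespace version, and the exact JSON layout are assumptions; inspect the configuration returned for your own gateway before applying any changes.

import json
import boto3

sitewise = boto3.client("iotsitewise")

GATEWAY_ID = "your-gateway-id"                 # placeholder
NAMESPACE = "iotsitewise:opcuacollector:2"     # assumed namespace version

# Read the current OPC-UA collector configuration of the gateway.
current = sitewise.describe_gateway_capability_configuration(
    gatewayId=GATEWAY_ID, capabilityNamespace=NAMESPACE
)
config = json.loads(current["capabilityConfiguration"])

# Illustrative only: point every source at a custom Greengrass stream.
for source in config.get("sources", []):
    source["destination"] = {
        "type": "StreamManager",
        "streamName": "SiteWise_Anywhere_Stream",
        "streamBufferSize": 10,
    }

sitewise.update_gateway_capability_configuration(
    gatewayId=GATEWAY_ID,
    capabilityNamespace=NAMESPACE,
    capabilityConfiguration=json.dumps(config),
)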

You can create your own custom stream on AWS IoT Greengrass V2 and point the destination for a data source to that stream. A stream can have an export definition, which defines the AWS destination to which your data will be transferred. Currently, AWS IoT SiteWise, AWS IoT Analytics, Amazon S3, and Amazon Kinesis Data Streams are supported as export configurations. When you export your data to Amazon Kinesis Data Streams, you have many options to read the data from Amazon Kinesis Data Streams and transfer it to another service. With consumers reading data from Amazon Kinesis Data Streams, you can send your data to different locations.

If, for example, you want to store your data in Amazon Timestream, you can use an AWS Lambda function or Amazon Kinesis Data Analytics for Apache Flink as a consumer for Amazon Kinesis Data Streams and write the data into your Amazon Timestream table, as in the sketch below.
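
A minimal AWS Lambda consumer along the following lines could write incoming Kinesis records to Amazon Timestream. The database and table names are placeholders, and the payload parsing is illustrative only, because the exact record format depends on what your gateway exports.

import base64
import json
import time
import boto3

timestream = boto3.client("timestream-write")

def handler(event, context):
    records = []
    for record in event["Records"]:
        # Kinesis delivers the exported stream entries base64-encoded.
        payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
        # Assumed fields; inspect a real record to confirm the structure.
        records.append({
            "Dimensions": [{"Name": "propertyAlias",
                            "Value": str(payload.get("propertyAlias", "unknown"))}],
            "MeasureName": "value",
            "MeasureValue": str(payload.get("value", 0)),
            "MeasureValueType": "DOUBLE",
            "Time": str(payload.get("timestamp", int(time.time() * 1000))),
            "TimeUnit": "MILLISECONDS",
        })
    if records:
        timestream.write_records(
            DatabaseName="sitewise_db",      # placeholder
            TableName="sitewise_data",       # placeholder
            Records=records,
        )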

With such an architecture, you can store your data not only in Amazon Timestream but also in any location that is accessible from your Amazon Kinesis Data Streams consumer.

If you aren't using an export configuration for a custom stream, you can develop your own AWS IoT Greengrass component to consume data from your custom stream.

Figure 3: Architecture to store data in many locations with AWS IoT SiteWise

Understanding the AWS IoT SiteWise Edge gateway architecture

The AWS IoT SiteWise Edge gateway architecture differs from the AWS IoT SiteWise gateway architecture in that it includes the SiteWiseEdgeProcessor, which lets you serve AWS IoT SiteWise Monitor portals at the edge and also process data at the edge.

Figure 4: AWS IoT SiteWise Edge gateway architecture

To send data from AWS IoT SiteWise Edge to many locations, you cannot use the same approach as with AWS IoT SiteWise. The custom stream for a data source defines where the SiteWiseEdgeCollectorOpcua sends the data. The Data Processing Pack already uses the custom stream name SiteWise_Edge_Stream. If you changed the stream name to your own custom stream, your data would not reach the SiteWiseEdgeProcessor.

Configure AWS IoT SiteWise Edge to store data in many locations

There are several options to send data from AWS IoT SiteWise Edge to many locations. If you do not want to send data to the AWS IoT SiteWise data store, you must remove the SiteWiseEdgePublisher from your AWS IoT Greengrass deployment, because the SiteWiseEdgePublisher reads data from the SiteWise_Stream and stores it in the AWS IoT SiteWise data store.

You can use the API at the edge to retrieve data and store it, for example, in a stream on AWS IoT Greengrass for further processing, as sketched below. This option requires you to query the API for every single asset property, and if your asset properties change, you must also change your application or the application's configuration.
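
As a rough sketch, querying the AWS IoT SiteWise API served by the Edge gateway could look like the following. The gateway address, the credentials handling, and the asset and property IDs are assumptions for illustration; you need to obtain local credentials from your gateway and adapt the endpoint to your environment.

import boto3

# Sketch: call the SiteWise API exposed by the Edge gateway instead of the
# cloud endpoint. Endpoint URL and IDs are placeholders.
sitewise_edge = boto3.client(
    "iotsitewise",
    endpoint_url="https://my-gateway.local:443",   # hypothetical gateway address
    region_name="eu-central-1",
    verify=False,  # the gateway may present a self-signed certificate
)

response = sitewise_edge.get_asset_property_value(
    assetId="11111111-2222-3333-4444-555555555555",     # placeholder
    propertyId="aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee",  # placeholder
)
print(response["propertyValue"]["value"], response["propertyValue"]["timestamp"])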

Another option is to develop a component that reads data from the SiteWise_Stream. The component transfers the data to another destination, such as another stream or a target in your on-premises environment.

Figure 5: Architecture to store data in many locations with AWS IoT SiteWise Edge

In the following example, we explain how you can read data from the SiteWise_Stream and, in one case, ingest the data into a custom stream to be transferred to AWS and, in another case, publish the data to a local MQTT message broker. The custom stream is created with an export configuration to Amazon Kinesis Data Streams on AWS.

The following code snippets are based on an AWS IoT Greengrass V2 component written in Python. The code uses the AWS Greengrass Stream Manager SDK for Python and the Paho Python Client.
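
The snippets assume imports and setup along the following lines. This is a sketch: the broker address and the constants MIN_MESSAGE_COUNT and LAST_READ_SEQ_NO are illustrative values, and the import style follows the AWS stream manager samples.

import logging
from threading import Thread

import paho.mqtt.client as paho
from stream_manager import (
    ExportDefinition,
    KinesisConfig,
    MessageStreamDefinition,
    Persistence,
    ReadMessagesOptions,
    ResourceNotFoundException,
    StrategyOnFull,
    StreamManagerClient,
)

logger = logging.getLogger(__name__)
stream_manager_client = StreamManagerClient()

# Illustrative values; adjust to your environment.
MQTT_BROKER = "localhost"
MQTT_PORT = 1883
MIN_MESSAGE_COUNT = 1
LAST_READ_SEQ_NO = -1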

The following variables are used in the custom component.

  • STREAM_NAME_SOURCE is the name of the stream to read the data from.
  • STREAM_NAME_TARGET is the name of your custom stream to which you want to send the data.
  • STREAM_NAME_CLOUD is the name of the Amazon Kinesis data stream on AWS. The stream STREAM_NAME_TARGET is created with an export configuration to STREAM_NAME_CLOUD.

For example:

STREAM_NAME_SOURCE = "SiteWise_Stream"
STREAM_NAME_TARGET = "SiteWise_Anywhere_Stream"
STREAM_NAME_CLOUD = "SiteWiseToKinesisDatastream"

Before starting the component, you must create an Amazon Kinesis data stream with the stream name STREAM_NAME_CLOUD on AWS.
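
For example, assuming a single shard is sufficient for your data volume, the stream could be created with boto3 as follows:

import boto3

# Create the Kinesis data stream that the custom Greengrass stream exports to.
# One shard is an assumption; size the stream for your actual throughput.
kinesis = boto3.client("kinesis")
kinesis.create_stream(StreamName="SiteWiseToKinesisDatastream", ShardCount=1)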

Upon start, the component checks whether the stream STREAM_NAME_TARGET exists. If the stream does not exist, it is created with an export configuration to Amazon Kinesis Data Streams on AWS.

try:
    response = stream_manager_client.describe_message_stream(STREAM_NAME_TARGET)
    logger.info("stream_name: %s details: %s", STREAM_NAME_TARGET, response)
except ResourceNotFoundException as error:
    logger.info("create message stream: %s error: %s", STREAM_NAME_TARGET, error)

    # Export everything appended to the target stream to Kinesis Data Streams on AWS.
    exports = ExportDefinition(
        kinesis=[KinesisConfig(
            identifier=f"{STREAM_NAME_CLOUD}",
            kinesis_stream_name=STREAM_NAME_CLOUD,
            batch_size=10,
            batch_interval_millis=60000
        )]
    )

    stream_manager_client.create_message_stream(
        MessageStreamDefinition(
            name=STREAM_NAME_TARGET,
            strategy_on_full=StrategyOnFull.OverwriteOldestData,
            persistence=Persistence.File,
            max_size=1048576,
            export_definition=exports
        )
    )
except Exception as error:
    logger.error("%s", error)

The component reads messages from STREAM_NAME_SOURCE. Once messages are available, it iterates over the entries in a message and starts threads to store the entries in a custom stream and to publish them to an MQTT message broker.

response = stream_manager_client.read_messages(
    STREAM_NAME_SOURCE,
    ReadMessagesOptions(
        desired_start_sequence_number=LAST_READ_SEQ_NO + 1,
        min_message_count=MIN_MESSAGE_COUNT,
        read_timeout_millis=1000
    )
)

for entry in response:
    logger.info("stream_name: %s payload: %s",
                STREAM_NAME_SOURCE, entry.payload)

    # send the data to another stream at the edge
    thread_stream = Thread(
        target=store_message_to_stream,
        args=[entry.payload])
    thread_stream.start()
    logger.info('thread_stream started: %s', thread_stream)

    # send the data to a local MQTT message broker
    thread_mqtt = Thread(
        target=publish_message_to_mqtt_broker,
        args=[entry.payload])
    thread_mqtt.start()
    logger.info('thread_mqtt started: %s', thread_mqtt)

The following function writes data to the custom stream STREAM_NAME_TARGET. Data ingested into this custom stream is transferred automatically to Amazon Kinesis Data Streams on AWS.

def store_message_to_stream(payload):
    try:
        sequence_number = stream_manager_client.append_message(stream_name=STREAM_NAME_TARGET, data=payload)
        logger.info('appended message to stream: %s sequence_number: %s message: %s',
                    STREAM_NAME_TARGET, sequence_number, payload)
    except Exception as error:
        logger.error("append message to stream: %s: %s",
                     STREAM_NAME_TARGET, error)

The following function publishes data to the topic sitewise on an MQTT message broker.

def publish_message_to_mqtt_broker(payload):
    try:
        logger.info('MQTT: publish message: %s', payload)
        c_mqtt = paho.Client()
        c_mqtt.on_publish = mqtt_on_publish
        c_mqtt.on_disconnect = mqtt_on_disconnect
        c_mqtt.connect(MQTT_BROKER, MQTT_PORT)
        ret = c_mqtt.publish("sitewise", payload)
        logger.info('MQTT: publish: ret: %s', ret)
        c_mqtt.disconnect()
    except Exception as error:
        logger.error("MQTT: publish message: %s", error)

Conclusion

In this blog post, you have learned how you can use an AWS IoT SiteWise gateway to collect data from your industrial equipment and send it to many locations. You have learned how to configure your gateway to send data from AWS IoT SiteWise or AWS IoT SiteWise Edge to a custom destination. Based on sample code, you have seen how you can transfer your data to a custom location on AWS and into your on-premises environment. Learn more on the AWS IoT SiteWise product page or in the AWS IoT SiteWise workshops.

About the author

Philipp Sacha

Philipp is a Specialist Solutions Architect for IoT at Amazon Web Services, supporting customers in the IoT area. He joined AWS in 2015 as a general Solutions Architect and moved into the role of an IoT Specialist in 2018.
