
SAP DMS Client – A Desktop Companion for Cloud Int…

  • By sujay
  • 29/04/2026

Why I Built This

While looking into Cloud Integration data archiving, I came across Kishor Gopinathan's blog post SAP Cloud Integration – Data Archiving to BTP SDM. It walks through the full setup — creating service instances, building the CloudIntegration_LogArchive destination, and activating archiving via API.

I followed it while onboarding onto a new Integration Suite landscape, and it worked great. But I noticed that the workflow assumes we have a REST client like Postman at hand to call APIs, and some familiarity with CMIS to browse the archived content. Not everyone on the team has that setup ready — and honestly, assembling JSON payloads and fetching OAuth tokens just to activate archiving felt like unnecessary friction for what should be a straightforward task.

So I built a small desktop app to lower that barrier: the SAP DMS Client. AI tooling made it realistic to go from idea to a working desktop app within my onboarding timeline. It reads our DMS and Cloud Integration service keys and gives us a UI for managing repositories, exporting destination files, activating archiving, and browsing archived content, with no REST client or CMIS knowledge required. The code is open source, so anyone can inspect and adapt it.


What It Does

The app is built with Electron, React, and Material UI (following SAP Horizon design guidelines). Once we load our two service key files — one for DMS, one for Process Integration Runtime — we get four things:

  • Repository management — create, list, and delete DMS repositories without CMIS API calls
  • Destination export — generate a CloudIntegration_LogArchive.json with all credentials pre-filled, ready to import into BTP Cockpit
  • Archiving activation — activate Cloud Integration data archiving in one click (the app handles the OAuth2 token exchange)
  • File explorer — browse the YYYY/MM/DD folder structure that Cloud Integration creates in the repository

The destination export is probably the biggest time saver. Instead of manually extracting clientId, clientSecret, tokenServiceURL, and the SDM endpoint from the service key JSON, the app assembles the full destination file automatically. We just import it into BTP Cockpit and move on.

Here's what the generated file looks like:

{
  "Name": "CloudIntegration_LogArchive",
  "Type": "HTTP",
  "URL": "https://<sdm-api-url>/browser",
  "Authentication": "OAuth2ClientCredentials",
  "RepositoryId": "<repository-id>",
  "clientId": "<client-id>",
  "clientSecret": "<client-secret>",
  "tokenServiceURL": "https://<token-service-url>/oauth/token"
}

No copy-paste errors, no missed fields.
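In code, that export is essentially a field mapping from the service key to the destination object. Here is a minimal sketch (the service key field names, i.e. uaa.clientid, uaa.clientsecret, uaa.url, and endpoints.ecmservice.url, are assumptions based on typical BTP service keys; check them against your own key):

```javascript
// Sketch: assemble the CloudIntegration_LogArchive destination from a DMS
// service key. The service key field paths used here (uaa.clientid,
// uaa.clientsecret, uaa.url, endpoints.ecmservice.url) are assumptions --
// verify them against your actual key before relying on this.
function buildDestination(serviceKey, repositoryId) {
  return {
    Name: "CloudIntegration_LogArchive",
    Type: "HTTP",
    // CMIS browser binding endpoint of the SDM instance
    URL: `${serviceKey.endpoints.ecmservice.url}/browser`,
    Authentication: "OAuth2ClientCredentials",
    RepositoryId: repositoryId,
    clientId: serviceKey.uaa.clientid,
    clientSecret: serviceKey.uaa.clientsecret,
    tokenServiceURL: `${serviceKey.uaa.url}/oauth/token`,
  };
}

// Example with a dummy service key:
const dest = buildDestination(
  {
    uaa: {
      clientid: "sb-abc",
      clientsecret: "s3cret",
      url: "https://tenant.authentication.eu10.hana.ondemand.com",
    },
    endpoints: {
      ecmservice: { url: "https://api-sdm-di.cfapps.eu10.hana.ondemand.com" },
    },
  },
  "repo-123"
);
console.log(dest.tokenServiceURL);
// https://tenant.authentication.eu10.hana.ondemand.com/oauth/token
```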

One thing to note: when we import the destination into BTP Cockpit, all fields are populated automatically except the client secret — BTP doesn't include it in the import format for security reasons. We'll need to grab the clientSecret from our DMS service key file and paste it manually into the destination configuration in the Cockpit after import.


What We Need Before Starting

The same BTP services from the original blog:

Service                                          Plan                   What We Need
Integration Suite                                enterprise / standard  Cloud Integration capability activated
Process Integration Runtime                      api                    Service instance + service key
Document Management Service, Integration Option  standard               Service instance + service key

Setup details: Initial Setup for DMS and Creating Service Key for Cloud Integration.

The app is on GitHub at github.com/sap-ef/da-dms-client. The easiest way to get started is to download a pre-built installer from the Releases page:

  • macOS: .dmg file
  • Windows: .exe installer

Alternatively, we can build from source with Node.js 18+:

git clone https://github.com/sap-ef/da-dms-client.git
cd da-dms-client
npm install
npm run electron:dev

The Workflow in Practice

Once the app is running, the flow follows a specific order — each step depends on the previous one:

  1. Load the DMS service key in Settings — this connects us to our Document Management Service instance
  2. Create a repository — we need a repository before we can generate a destination (the destination includes the repository ID)
  3. Export the destination file — the app generates the CloudIntegration_LogArchive.json with all the values from the service key
  4. Import the destination into the Integration Suite subaccount — in BTP Cockpit under Connectivity > Destinations (remember to fill in the client secret manually after import)
  5. Load the Cloud Integration service key in Settings — this connects us to the Process Integration Runtime instance
  6. Activate data archiving — this only works if the destination already exists in BTP, otherwise Cloud Integration won't know where to write the logs
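Under the hood, step 6 is a standard OAuth2 client-credentials token exchange against the Process Integration Runtime key, followed by one authenticated call to the archiving activation API. A minimal sketch of the token request (the activation path in the trailing comment is illustrative, not the exact API route):

```javascript
// Sketch of the token exchange behind "Activate data archiving".
// tokenServiceURL, clientId, and clientSecret come from the Process
// Integration Runtime service key. The activation endpoint shown in the
// comment below is a placeholder, not a documented path.
function buildTokenRequest(tokenServiceURL, clientId, clientSecret) {
  return {
    url: `${tokenServiceURL}?grant_type=client_credentials`,
    method: "POST",
    headers: {
      // Standard OAuth2 client-credentials: Basic auth with id:secret
      Authorization:
        "Basic " + Buffer.from(`${clientId}:${clientSecret}`).toString("base64"),
    },
  };
}

const req = buildTokenRequest(
  "https://tenant.authentication.eu10.hana.ondemand.com/oauth/token",
  "sb-abc",
  "s3cret"
);

// With the bearer token, activation is a single authenticated POST, e.g.:
//   const { access_token } = await (await fetch(req.url, req)).json();
//   await fetch(`${runtimeUrl}/${ACTIVATION_PATH}`, {   // ACTIVATION_PATH: placeholder
//     method: "POST",
//     headers: { Authorization: `Bearer ${access_token}` },
//   });
```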


Don't expect logs to appear immediately. As described in the original blog, the log collection process runs on a 7-day cycle — so it may take several days before the first archived files show up in the repository.

After archiving is active, the File Explorer lets us browse what Cloud Integration has stored. The folder structure is date-based (2026/04/15/) with .zip archives of message processing logs for each day.
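That browsing is, under the hood, the CMIS 1.1 browser binding: a GET with cmisselector=children against a folder path returns its entries as JSON. A minimal sketch of pulling the child names out of such a response (the sample payload mirrors the browser-binding JSON shape; the repository URL in the comment is illustrative):

```javascript
// Sketch: list child names from a CMIS 1.1 browser-binding "children"
// response, as returned by e.g.
//   GET {repoUrl}/root/2026/04?cmisselector=children
// (repoUrl here is a placeholder for your repository's browser-binding URL).
function childNames(cmisResponse) {
  return (cmisResponse.objects || []).map(
    (entry) => entry.object.properties["cmis:name"].value
  );
}

// Minimal example payload in the browser-binding JSON shape:
const sample = {
  objects: [
    { object: { properties: { "cmis:name": { value: "15" } } } },
    { object: { properties: { "cmis:name": { value: "16" } } } },
  ],
};
console.log(childNames(sample)); // [ '15', '16' ]
```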



Conclusion

Kishor Gopinathan's original blog remains the foundation for understanding Cloud Integration data archiving to SDM. The DMS Client just makes the repetitive parts easier so we can spend less time on plumbing and more on the integration itself.

The project is open source at github.com/sap-ef/da-dms-client. If you've dealt with Cloud Integration archiving in your landscape, I'd be curious to hear how you've approached it — or if there are other parts of the DMS workflow that could use better tooling. Drop a comment or open an issue.

