IBM Z Software


This portal is for opening public enhancement requests against IBM Z Software products. To view all of the ideas you have submitted to IBM, create and manage groups of ideas, or create an idea explicitly set to be either visible to all (public) or visible only to you and IBM (private), use the IBM Unified Ideas Portal (https://ideas.ibm.com).


Shape the future of IBM!

We invite you to shape the future of IBM, including product roadmaps, by submitting ideas that matter to you the most. Here's how it works:

Search existing ideas

Start by searching and reviewing ideas and requests to enhance a product or service. Take a look at ideas others have posted, and add a comment, vote, or subscribe to updates on them if they matter to you. If you can't find what you are looking for, post a new idea.

Post your ideas
  1. Post an idea.

  2. Get feedback from the IBM team and other customers to refine your idea.

  3. Follow the idea through the IBM Ideas process.


Specific links you will want to bookmark for future use

Welcome to the IBM Ideas Portal (https://www.ibm.com/ideas) - Use this site to find out additional information and details about the IBM Ideas process and statuses.

IBM Unified Ideas Portal (https://ideas.ibm.com) - Use this site to view all of your ideas, create new ideas for any IBM product, or search for ideas across all of IBM.

ideasibm@us.ibm.com - Use this email to suggest enhancements to the Ideas process or request help from IBM for submitting your Ideas.


IBM Z Common Data Provider

Showing 37 ideas

Delete trailing blanks in Data Receiver CSV fields

Hi, the Data Receiver component is supposed to be optimal for sending data to Splunk, yet it sends lots of CSV fields to Splunk with trailing blanks. The customer would like the tool to optionally delete trailing blanks from CSV fields and hence prope...
over 1 year ago in IBM Z Common Data Provider · 3 · Future consideration
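
A common stopgap until such an option exists is to trim the fields before or at the ingestion point. The Python sketch below is a minimal, hypothetical illustration of that cleanup step on a headerless CSV file; it is not part of the Data Receiver, and the file-based interface is an assumption made for the example.

```python
import csv
import sys

def strip_trailing_blanks(in_path: str, out_path: str) -> None:
    """Rewrite a CSV file with trailing blanks removed from every field.

    Hypothetical pre-processing step; not part of the Data Receiver itself.
    """
    with open(in_path, newline="") as src, open(out_path, "w", newline="") as dst:
        reader = csv.reader(src)
        writer = csv.writer(dst)
        for row in reader:
            # rstrip() removes trailing blanks while leaving leading
            # whitespace and embedded blanks untouched.
            writer.writerow([field.rstrip() for field in row])

if __name__ == "__main__":
    strip_trailing_blanks(sys.argv[1], sys.argv[2])
```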

Stream WebSphere Liberty message.log to Logstash ELK

Recently the application team intends to transfer the WebSphere Liberty message.log and to display only particular information, such as response time, in MAGE. So we configured a stream for the WebSphere Liberty message.log to be sent to Logstash ELK. From the Data Streamer l...
over 1 year ago in IBM Z Common Data Provider · 0
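
As a rough illustration of the filtering the team describes, the hedged Python sketch below scans a WebSphere Liberty message.log and keeps only the lines that carry a response-time figure. The "response time ... ms" pattern is an assumption for the example, not Liberty's documented log layout, so the regular expression would need adjusting to the real messages.

```python
import re
import sys

# Assumed pattern: lines that mention a response time in milliseconds,
# e.g. "... response time: 123 ms". Adjust to the actual message.log layout.
RESPONSE_TIME = re.compile(r"response time[:=]?\s*(\d+)\s*ms", re.IGNORECASE)

def extract_response_times(log_path: str):
    """Yield (line, milliseconds) for every log line carrying a response time."""
    with open(log_path, encoding="utf-8", errors="replace") as log:
        for line in log:
            match = RESPONSE_TIME.search(line)
            if match:
                yield line.rstrip(), int(match.group(1))

if __name__ == "__main__":
    for line, ms in extract_response_times(sys.argv[1]):
        print(f"{ms:>8} ms  {line}")
```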

Support RACF Keyring keystore type.

Dear, currently the Java keystore and the importCertificate.sh script are the only supported method for handling certificates to secure communications between the Data Streamer and subscribers; RACF® keyring is not supported. This approach is no...
almost 2 years ago in IBM Z Common Data Provider · 0 · Future consideration
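
For context on what the certificates are used for: the subscriber side of the connection ultimately just needs to trust the signing certificate in a format it understands. The Python sketch below is a generic, hedged example of a TLS client that trusts a CA certificate exported to PEM; the host, port, and file name are placeholders, and this is not the product's documented configuration.

```python
import socket
import ssl

# Placeholder values; the real host, port, and exported CA certificate
# depend on the local Data Streamer configuration.
HOST, PORT = "datastreamer.example.com", 8443
CA_PEM = "exported-ca.pem"  # CA certificate exported from the keystore/keyring

def open_tls_connection() -> ssl.SSLSocket:
    """Open a TLS connection that trusts the exported CA certificate."""
    context = ssl.create_default_context(cafile=CA_PEM)
    raw = socket.create_connection((HOST, PORT))
    return context.wrap_socket(raw, server_hostname=HOST)

if __name__ == "__main__":
    with open_tls_connection() as conn:
        print("Negotiated protocol:", conn.version())
```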

Need function to run CDP Log Forwarder data source discovery function on demand

Provide a function to allow the Common Data Provider data source discovery function to run once, when commanded by a user. This would often be used when Discovery Interval and Pattern Discovery Interval are set to 0 (due to CPU usage requirements)...
over 3 years ago in IBM Z Common Data Provider · 0 · Planned for future release

Enable capture when engine tasks are down for any reason

Currently, if an engine task (for example, the SMF engine) is taken down, such as for product maintenance, any records cut in the period it is down are lost and not streamed to the target. This is obviously an unacceptable loss of data in the real-time monit...
over 2 years ago in IBM Z Common Data Provider · 0 · Planned for future release

Provide Batch configuration tool

This is a request to enhance and improve CDP configuration with a batch-type interface. Currently, CDP provides a web-based user interface, delivered as a plug-in to z/OSMF, for administering policies. For large implementations this is cumbersome,...
over 3 years ago in IBM Z Common Data Provider · 1 · Planned for future release
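
To make the request concrete, the sketch below shows one purely hypothetical shape a batch-style, scriptable policy definition could take: a policy kept in JSON and checked by a small script before deployment. The file layout and field names are invented for illustration and do not reflect any existing CDP policy format.

```python
import json
import sys

# Hypothetical policy layout, invented purely for illustration; it does not
# reflect the actual CDP policy format used by the z/OSMF plug-in.
REQUIRED_KEYS = {"name", "data_streams", "subscribers"}

def validate_policy(path: str) -> None:
    """Fail fast if the hypothetical JSON policy is missing required keys."""
    with open(path, encoding="utf-8") as f:
        policy = json.load(f)
    missing = REQUIRED_KEYS - policy.keys()
    if missing:
        sys.exit(f"Policy {path} is missing keys: {', '.join(sorted(missing))}")
    print(f"Policy '{policy['name']}': "
          f"{len(policy['data_streams'])} stream(s), "
          f"{len(policy['subscribers'])} subscriber(s)")

if __name__ == "__main__":
    validate_policy(sys.argv[1])
```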

CDP CPU usage by workstream

We currently run a single Log Forwarder and Data Streamer on each LPAR. They gather various sysouts and OPERLOG entries for multiple applications to a number of different Splunk and Elastic servers. We require the ability to determine how much CPU each...
about 5 years ago in IBM Z Common Data Provider · 1 · Future consideration

CDPz: One datastreamer for multiple data gatherers from different sysplexes

Today, data gatherers and the Data Streamer are locally defined to each LPAR. If we consider that the data gatherers (System Data Engine and Log Forwarder) should stay local to each LPAR, it makes sense to be able to define only one Data Streamer for mul...
over 6 years ago in IBM Z Common Data Provider · 0 · Future consideration

Add field names to the streamed CSV when using Generic HTTP subscriber.

The Generic HTTP subscriber is a great option for integrating with third-party and open-source tools. The CSV format is easy to exploit and adapt. However, it has no header and only contains the raw data, which greatly reduces its usefulness. ...
about 2 years ago in IBM Z Common Data Provider · 0 · Planned for future release
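
Until a header option exists, a receiving endpoint has to re-attach field names itself. The Python sketch below is a hedged example of that workaround: it zips an externally maintained field list onto each headerless CSV row so that downstream tools see named fields. The field names are placeholders; the real ones would come from the stream's definition.

```python
import csv
import json
import sys

# Placeholder field list; in practice this comes from the stream definition
# that produced the headerless CSV.
FIELD_NAMES = ["timestamp", "system", "jobname", "message"]

def rows_with_field_names(csv_path: str):
    """Yield each headerless CSV row as a dict keyed by the known field names."""
    with open(csv_path, newline="") as src:
        for row in csv.reader(src):
            yield dict(zip(FIELD_NAMES, row))

if __name__ == "__main__":
    for record in rows_with_field_names(sys.argv[1]):
        print(json.dumps(record))
```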

CDPz lacks the ability to create meaningful Splunk data to report on when it all exists in one SMF record

I am trying to create a CPC Utilization by LPAR view, for which all the data exists in the SMF type 70 record. This is currently being done by MXG, TDSz, MICS, and other products, but the default records from CDPz are basically unusable. I tried to create a Custom Dat...
over 4 years ago in IBM Z Common Data Provider · 1 · Planned for future release
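
For reference, the arithmetic behind a CPC-utilization-by-LPAR view from SMF type 70 data is essentially a ratio of partition dispatch time to logical processor online time over the RMF interval. The Python sketch below shows only that calculation; the SMF70PDT and SMF70ONT field names mentioned in the comments are the commonly cited type 70-1 fields but should be treated as assumptions here, since the record layout is not part of this idea.

```python
from dataclasses import dataclass

@dataclass
class LogicalCpuInterval:
    """One logical processor's figures for an RMF interval.

    dispatch_seconds : partition dispatch time (commonly taken from SMF70PDT)
    online_seconds   : logical processor online time (commonly SMF70ONT)
    Both field names are assumptions here; check the SMF type 70-1 layout.
    """
    dispatch_seconds: float
    online_seconds: float

def lpar_busy_percent(cpus: list[LogicalCpuInterval]) -> float:
    """Percentage of the LPAR's online logical capacity that was dispatched."""
    online = sum(c.online_seconds for c in cpus)
    if online == 0:
        return 0.0
    return 100.0 * sum(c.dispatch_seconds for c in cpus) / online

if __name__ == "__main__":
    # Two logical CPUs, each online for a 900-second interval.
    sample = [LogicalCpuInterval(315.0, 900.0), LogicalCpuInterval(405.0, 900.0)]
    print(f"LPAR busy: {lpar_busy_percent(sample):.1f}%")  # -> 40.0%
```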