
Monday, February 25, 2008

Kalido Debuts its Business Modeling Tool - Free Download

Recently Kalido debuted its business modeling tool, aptly named Business Information Modeler. Business modeling finally has a dedicated home: until now, people resorted to all sorts of drawing applications, from Visio to PowerPoint, for their business modeling needs. The best thing about this tool, apart from a slick interface that introduces gesture-based modeling, is its ability to connect to the Kalido platform and import the model right into it, saving weeks' worth of work. It can also push changes to the platform as they happen in the model itself.

Kalido has also started a Google Group for the modeler. You can get a free copy of the tool, along with insights from seasoned business modelers, by joining The Business Modeling Community Forum on Google Groups. It is worth a look for anyone working in business modeling.

Wednesday, September 12, 2007

Kalido User Group 2007


It's that time of the year again: time to get ready for the Kalido User Group conference 2007. Current and prospective Kalido customers will have an excellent opportunity to learn from other customers who are driving business value from enterprise data warehousing and master data management solutions, whether in operational cost savings, increased sales, or improved business insight.

My company, Project Performance Corporation, happens to be a Gold Sponsor of the 2007 edition of KUG. It will therefore also be a good opportunity for current and prospective customers to visit the partner pavilion, where PPC and the rest of the Kalido technology and systems integrator partner community will demonstrate their latest solutions and services.

So, see you all at KUG 2007.

Monday, August 20, 2007

No SAP DW, No Oracle BI, but Kalido. Why?

I am asked this question all the time, and I give the same answer that Bill Hewitt, Kalido's CEO and President, gave to Jim Ericson in a recent interview:

"All the ERP vendors are saying it's about SOA. SAP says we can provide everything: middleware, information management, transaction systems. Oracle says the same thing. What customer in their right mind wants to go back to buying every piece of their stack from one vendor? If you want to deliver your applications with a database flavor, then Oracle is a great place to go. If you want to deliver your middleware with a manufacturing flavor, SAP is a great place to go. But SAP has rolled out their second try at MDM [master data management], and it's failing because while they understand applications, they don't understand data. Oracle understands data but they don't understand applications. What we promote is maintaining separation between the five layers of your IT environment that move at different speeds: your infrastructure, your security, your information management, your applications and your user interface. Eventually, SOA is going to be the technology that brings all that together, but I don't think you should need to have your information management come out of your transaction systems or vice versa. What customers want is interoperability between the layers of the IT stack so they can make changes that suit their environment as they grow or shrink or change in other ways."

Wednesday, August 01, 2007

Kalido Error: -5092 : Handle does not belong to given list

This error appears when you try to import a transaction file definition. The only reason such an import throws this error is that there are existing transaction batches associated with the transaction file definition. This is a bug in the 8.3 release of Kalido DIW and will be fixed in a subsequent release. In the meantime, you have to delete the associated transaction batches in order to migrate a transaction file definition successfully.
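As a rough illustration of the workaround, a script along these lines could locate the blocking batches before you retry the import. Note that the table and column names (TRANSACTION_BATCH, FILE_DEF_NAME) are hypothetical placeholders, not Kalido's actual repository schema; check with Kalido Support for the real objects, and do the actual deletion through the DIW tools rather than raw SQL.

```python
import oracledb  # assumes the python-oracledb driver is installed

# Placeholder credentials; point this at the Kalido warehouse schema.
conn = oracledb.connect(user="WHSUSR", password="<password>", dsn="kalidodb")
cur = conn.cursor()

# HYPOTHETICAL repository objects: find transaction batches that still
# reference the transaction file definition you are trying to import.
cur.execute(
    """SELECT batch_id
         FROM transaction_batch
        WHERE file_def_name = :name""",
    name="MY_TRANSACTION_FILE_DEF",
)

for (batch_id,) in cur:
    print(f"Blocking batch {batch_id}: delete it in DIW, then re-import")
```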

Friday, July 06, 2007

Kalido Setup and Default Passwords of Oracle Schemas

When you set up Kalido for the first time, it asks you to create a number of schemas in the database. It also expects them to have default passwords, e.g. goldeneyex for the WHSUSR schema or gatekeeper for the GATEKEEPER schema.

However, almost every organization now has strict password guidelines, and a strict DBA group would never allow the kind of passwords mentioned above. If the DBAs do decide to use different passwords for these schemas, you run into a problem. When you create a Kalido Gatekeeper, it goes to the GATEKEEPER schema and grabs the encrypted username and password for the WHSUSR schema from the USER_DETAILS table. If you have changed the password of the GATEKEEPER schema itself, you can override that by checking the option ‘Force the GateKeeper to connect to the database as a specific user’ and supplying the schema name and new password. Even then, however, the Gatekeeper configuration will fail, because the encrypted password it grabs for WHSUSR from USER_DETAILS still corresponds to the default WHSUSR password, goldeneyex.

You might think this is not a problem: just use the KSetPass utility to change the WHSUSR password. That will not work, though, because KSetPass itself needs a configured Gatekeeper first. So now you are in a vicious circle. There are two ways out of this puzzle. One is to ask Kalido Support for the encrypted value of your WHSUSR password and update the USER_DETAILS table manually. The other is to ask your DBA team to reset the WHSUSR password to the default, goldeneyex. Once you have configured your Gatekeeper, they can change it back to whatever they want, and you can then update it accordingly using the KSetPass utility.
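If you take the second route, the DBA's part is simple enough to script. Here is a minimal sketch assuming the python-oracledb driver and a DBA account; the DSN, credentials, and the final password are placeholders, and the encrypted copy in USER_DETAILS still needs KSetPass afterwards.

```python
import oracledb

# Connect with DBA privileges; credentials and DSN are placeholders.
dba = oracledb.connect(user="SYSTEM", password="<dba_password>", dsn="kalidodb")
cur = dba.cursor()

# Step 1: restore the default password Kalido expects, so the
# Gatekeeper configuration can succeed.
cur.execute("ALTER USER WHSUSR IDENTIFIED BY goldeneyex")

input("Configure the Kalido Gatekeeper now, then press Enter...")

# Step 2: once the Gatekeeper is configured, set a policy-compliant
# password again. Run KSetPass afterwards so the encrypted copy in
# USER_DETAILS stays in sync with this new password.
cur.execute('ALTER USER WHSUSR IDENTIFIED BY "N3w#Passw0rd"')
dba.close()
```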

The key here is to get your Kalido Gatekeeper configured first and foremost. Once the Gatekeeper is in place, you can do pretty much whatever you want.

Tuesday, January 09, 2007

Think Big, Start Small

This is my second post in the series in which I will walk through a real-life MDM (Master Data Management) implementation. The problem at hand, common to all MDM implementations, is the presence of multiple operational and transactional systems. In this case, it gets worse: as cases progress from one system to the next, they are manually re-entered in the next system in the chain, with all the problems that entails. There are four systems in all: w, x, y and z. Cases progress from w through z in that order, each system is fed by manual entry, and the systems are as distinct and disconnected from each other as the North and South Poles.

When starting an MDM initiative in such a scenario, it is tempting to put all the systems in the project scope, but that would have disaster written all over it. It is heartening that the scope of this phase of the MDM implementation has been scaled down to system y. That doesn't mean our vision narrows so far that we never look at the other systems: the right balance is to start the implementation with one system while keeping the design flexible enough to accommodate the remaining systems in coming phases, as the sketch below illustrates.

Right now we are identifying the data elements that will be part of the Gold Copy in this phase. We are also looking at data elements from the other systems, so that we keep them in the back of our minds while working on the design and the model.
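To make the "flexible enough for later phases" point concrete, here is a toy sketch of how a gold-copy merge might be structured so that adding systems w, x and z later means only extending a priority list, not rewriting the merge logic. All field names and survivorship rules here are invented for illustration; they are not the project's actual model.

```python
from dataclasses import dataclass
from typing import Optional

# Source systems in survivorship order. This phase only loads "y",
# but w, x and z can be added later without touching the merge code.
SOURCE_PRIORITY = ["y"]  # later phases: ["w", "x", "y", "z"]

@dataclass
class CaseRecord:
    source: str
    case_id: str
    claimant_name: Optional[str] = None  # hypothetical data elements
    status: Optional[str] = None

def merge_gold_copy(records: list[CaseRecord]) -> CaseRecord:
    """Build the gold-copy record field by field, taking each value
    from the highest-priority source that supplies it."""
    ordered = sorted(records, key=lambda r: SOURCE_PRIORITY.index(r.source))
    gold = CaseRecord(source="gold", case_id=ordered[0].case_id)
    for field in ("claimant_name", "status"):
        for rec in ordered:
            value = getattr(rec, field)
            if value is not None:
                setattr(gold, field, value)
                break
    return gold
```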

Of all the problems in Enterprise Information Management, MDM is one where taking things one piece at a time is highly beneficial and effective.

Wednesday, January 03, 2007

Real World Master Data Management (MDM) Implementation

Okay! So you have been to the conferences and attended the webinars. But how many real-world MDM implementations have you actually come across? One, or maybe none. MDM is such a hot buzzword these days that every vendor out there has started offering a solution based on technologies stacked together in a hurry to cash in on the concept.
Just a few weeks ago we started a Master Data Management project. When I was starting it, I googled for real-world case studies on MDM implementations, and what I found fell well short of my expectations. So I thought it might not be a bad idea to blog through the whole implementation process. I will share as much detail as possible about the implementation itself, while keeping some of the less interesting details about the organization and the data hidden. Keep reading …
