Data Staging Training
Dear all,
I am a certified BW 2.0 Consultant who also followed the delta courses for both 3.0 and 3.5. However, if I want to become a certified BW 3.5 Consultant, I need to do a complete certification again. I already received hardcopies of most of the documentation for 3.5, but I am missing BW340. Can somebody supply me with a softcopy?
Best regards,
Robert Boneschansker
[email protected]
Well, to be honest, there are some online institutes that do offer training, but Ralph Kimball has some great books on the subject.
Mohammad Farhan Alam
Similar Messages
-
Difference between Data staging and Dimension Table ?
Data Staging:
Data extraction and transformation is done here.
Meaning that if we have source data in a flat file, we extract it and load it into staging tables, where we take care of nulls, change datetime formats, etc. After such cleansing/transformation, at the end, we load it to the Dim/Fact tables.
Pros: Makes the process simpler and easier, and we can also keep track of the data, since it is retained in staging.
Cons: Staging tables take up space, hence they require additional storage.
Dimension Table:
Tables which describe/store the attributes of specific objects.
Below is a star schema with dimensions storing information related to Product, Customer, etc.
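As a generic illustration of the extract-cleanse-load flow described above (not SAP-specific; the table layout, column names, and source date formats are all invented for the example), a minimal staging sketch:

```python
import sqlite3
from datetime import datetime

# Hypothetical raw extract from a flat file: nulls and mixed date formats.
raw_rows = [
    {"product_id": "P1", "name": "Widget", "created": "2012-02-13"},
    {"product_id": "P2", "name": None,     "created": "13/02/2012"},
]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_product (product_id TEXT, name TEXT, created TEXT)")
conn.execute("CREATE TABLE dim_product (product_id TEXT PRIMARY KEY, name TEXT, created TEXT)")

# 1. Extract: land the raw data unchanged in the staging table.
conn.executemany("INSERT INTO stg_product VALUES (:product_id, :name, :created)", raw_rows)

def cleanse(row):
    """Handle nulls and normalize datetime formats, as described above."""
    pid, name, created = row
    name = name or "UNKNOWN"                      # replace nulls with a default
    for fmt in ("%Y-%m-%d", "%d/%m/%Y"):          # try the known source formats
        try:
            created = datetime.strptime(created, fmt).strftime("%Y-%m-%d")
            break
        except ValueError:
            continue
    return (pid, name, created)

# 2. Transform + load: cleanse the staged rows, then fill the dimension table.
cleaned = [cleanse(r) for r in conn.execute("SELECT * FROM stg_product")]
conn.executemany("INSERT INTO dim_product VALUES (?, ?, ?)", cleaned)

print(conn.execute("SELECT * FROM dim_product ORDER BY product_id").fetchall())
# [('P1', 'Widget', '2012-02-13'), ('P2', 'UNKNOWN', '2012-02-13')]
```

Because the raw rows survive in the staging table, a failed or suspect load can be audited and replayed without re-extracting from the source, which is the "keep track of data" pro mentioned above.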
-Vaibhav Chaudhari -
Joining head- and item- records during data-staging
Hi experts,
I've got the following scenario:
I get data from 2 datasources, one (myds_h) provides documents-heads and the other one provides document-items (myds_i).
Unfortunately the myds_i-DataSource does not contain the head-fields (but foreign-key-fields with the reference to the head).
For the reporting I'd like to provide item-level data containing the document-head-information as well.
Which point of the data-staging in the BW-system would you recommend for doing this join?
Maybe some options:
a) I could enhance the myds_i-DataSource and do the join in the source-system.
b) I could enhance the item-data using the transformation between the item-PSA and an item-ODS.
c) I could enhance the item-data using a transformation between an item-ODS and an additional item/head-ods
d) I could enhance the item-data using the transformation between the item-ODS and the final InfoCube.
e) I could use an Analysis-Process-Chain and an InfoSet instead of the above mentioned transformations.
Thanks for your comments and input in advance!
Best regards,
Marco
Edited by: Marco Simon on Feb 13, 2012 3:52 PM - inserted one option.
Hello Marco,
In your solutions a) to d), you will have a delta problem. If header data can be modified without any modification of the items, you will lose some header modifications in your item DSO.
The easiest solution will probably be to handle the header data as master data (the header fields as attributes of this master data); the header data will then be available with your item data. This solution may cause performance problems if you have a lot of headers.
Another solution would be to build a transformation between your header DSO and your item DSO (every header modification will modify all items of the header). This will require some ABAP.
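Whichever staging layer you pick for options a) to d), the lookup itself is simple. A language-neutral sketch of enriching item records with their header fields via the foreign key (all field names here are invented for illustration):

```python
# Hypothetical header records keyed by document number.
headers = {
    "4711": {"doc_type": "ZORD", "sold_to": "C100"},
    "4712": {"doc_type": "ZRET", "sold_to": "C200"},
}

# Item records carry only the foreign key back to the header.
items = [
    {"doc_no": "4711", "item_no": "10", "material": "M-01"},
    {"doc_no": "4711", "item_no": "20", "material": "M-02"},
    {"doc_no": "4712", "item_no": "10", "material": "M-03"},
]

def enrich(item):
    """Join one item with its header via the foreign key."""
    head = headers.get(item["doc_no"], {})
    return {**item, **head}   # the item-level record now carries the header fields

enriched = [enrich(i) for i in items]
print(enriched[0])
# {'doc_no': '4711', 'item_no': '10', 'material': 'M-01', 'doc_type': 'ZORD', 'sold_to': 'C100'}
```

Note the delta caveat above: once items are enriched this way, a later change to a header alone does not re-derive the already-stored item records unless something (a header-to-item transformation, or master data lookup at query time) propagates it.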
Regards,
Fred -
Difference between the data staging of generic and application-specific DataSources
Hi,
Can anyone tell me the difference between the data staging of generic and application-specific DataSources? We know that LO data is staged in the queued delta, the update queue, and the BW delta queue; I want to know where generic data is actually staged before it is loaded into BW.
Thanks.
Generic DataSources are based on either a DB table/view, a function module, or an ABAP query. So normally the data stages in the corresponding DB tables, or you calculate it at extraction time. There is no update queue like in LO.
Best regards
Dirk -
We are now running SCM 4.1, which includes BW 3.5, and are currently using an external BW system for data staging before pulling the data over into APO.
Are there any consequences to doing the data staging on the APO BW side itself, without using an external BW system? We could still use the external BW system for reporting.
If you stage directly in APO/BW and then use the external BW for reporting, the data will not be in sync.
-
Why normalize data before training it in Azure ML?
I've been experimenting with normalizing data before training a model, and I don't get it. I understand that training should theoretically work better when the data is first normalized, especially with geometry-based algorithms like SVM.
But when I try this in Azure ML, I don't see any improvement. In fact, the results get worse half of the time.
For example, I tried creating a two-class prediction model using the German Credit Card UCI data set. According to this article, it's best to normalize the data using the tanh method:
http://azure.microsoft.com/en-us/documentation/articles/machine-learning-sample-credit-risk-prediction/
This is what I get when I try predicting column 21 in the data set:
No normalization --> AUC = 0.740
Normalize using Z-score --> AUC = 0.740
Normalize using MinMax --> AUC = 0.740
Normalize using Logistic --> AUC = 0.670
Normalize using LogNormal --> AUC = 0.733
Normalize using tanh --> AUC = 0.696
As you see, the suggested tanh normalization gives almost the worst performance, and doing no normalization at all yields the best results.
Any suggestions what might be going on?
Hi Andreas,
Feature normalization/scaling indeed helps a lot with gradient descent convergence. However, it shouldn't be seen only as a performance-optimization technique. It is an important data pre-processing step for algorithms which optimize a distance function. In those cases the data should be normalized so that features contribute proportionally to the distance computation. Classifiers like the Bayes Point Machine don't work this way and hence don't need feature normalization, but SVMs certainly do.
Here is a decent read on the topic (reviewed by Schölkopf himself).
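For reference, the normalization methods compared above are simple transforms. A sketch of three of them, plus a tiny demonstration of why unscaled features distort a distance computation (the tanh-estimator constants below are one common choice; it is an assumption that Azure ML uses exactly this form, and the credit-amount/age numbers are made up):

```python
import math

def zscore(xs):
    """(x - mean) / std: centers at 0 with unit variance."""
    m = sum(xs) / len(xs)
    sd = math.sqrt(sum((x - m) ** 2 for x in xs) / len(xs))
    return [(x - m) / sd for x in xs]

def minmax(xs):
    """Rescale linearly into [0, 1]."""
    lo, hi = min(xs), max(xs)
    return [(x - lo) / (hi - lo) for x in xs]

def tanh_est(xs):
    """One common 'tanh estimator' form: 0.5 * (tanh(0.01 * z) + 1),
    squashing outliers smoothly into (0, 1)."""
    return [0.5 * (math.tanh(0.01 * z) + 1) for z in zscore(xs)]

# Two features on wildly different scales: credit amount vs. age.
amount = [250.0, 5000.0, 12000.0]
age    = [23.0, 45.0, 67.0]

# The raw Euclidean distance between rows 0 and 1 is dominated by 'amount'.
raw = math.hypot(amount[1] - amount[0], age[1] - age[0])

# After min-max scaling, both features contribute proportionally.
a_s, g_s = minmax(amount), minmax(age)
scaled = math.hypot(a_s[1] - a_s[0], g_s[1] - g_s[0])

print(round(raw, 1), round(scaled, 3))
```

This is the "contribute proportionally" point above: before scaling, the age difference of 22 years is numerically invisible next to the amount difference of 4750; after scaling, both differences are comparable. It doesn't guarantee a better AUC on any given dataset, which is consistent with the mixed results reported in the question.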
-Y- -
Upcoming SAP Best Practices Data Migration Training - Chicago
YOU ARE INVITED TO ATTEND HANDS-ON TRAINING
SAP America, Downers Grove in Chicago, IL:
November 3 – 5, 2010
Installation and Deployment of SAP Best Practices for Data Migration & SAP BusinessObjects Data Services
Install and learn how to use the latest SAP Best Practices for Data Migration package. This new package combines the familiar IDoc technology together with the SAP BusinessObjects (SBOP) Data Services to load your customer's legacy data to SAP ERP and SAP CRM (New!).
Agenda
At the end of this unique hands-on session, participants will depart with the SBOP Data Services and SAP Best Practices for Data Migration installed on their own laptops. The three-day training course will cover all aspects of the data migration package including:
1. Offering Overview – Introduction to the new SAP Best Practices for Data Migration package and data migration content designed for SAP BAiO / SAP ERP and SAP CRM
2. Data Services fundamentals – Architecture, source and target metadata definition. Process of creating batch jobs, validating, tracing, debugging, and data assessment.
3. Installation and configuration of the SBOP Data Services – Installation and deployment of the Data Services and content from SAP Best Practices. Configuration of your target SAP environment and deploying the Migration Services application.
4. Customer Master example – Demonstrations and hands-on exercises on migrating an object from a legacy source application through to the target SAP application.
5. Overview of Data Quality within the Data Migration process – A demonstration of the Data Quality functionality available to partners using the full Data Services toolset as an extension to the Data Services license.
Logistics & How to Register
Nov. 3 – 5: SAP America, Downers Grove, IL
Wednesday 10AM – 5PM
Thursday 9AM – 5PM
Friday 8AM – 3PM
Address:
SAP America – Buckingham Room
3010 Highland Parkway
Downers Grove, IL USA 60515
Partner Requirements: All participants must bring their own laptop to install SAP Business Objects Data Services on it. Please see attached laptop specifications and ensure your laptop meets these requirements.
Cost: Partner registration is free of charge
Who should attend: Partner team members responsible for customer data migration activities, or for delivery of implementation tools for SAP Business All-in-One solutions. Ideal candidates are:
• Data Migration consultants and IDoc experts involved in data migration and integration projects
• Functional experts that perform mapping activities for data migration
• ABAP developers who write load programs for data migration
Trainers
Oren Shatil – SAP Business All-in-One Development
Frank Densborn – SAP Business All-in-One Development
To register please use the hyperlink below.
http://service.sap.com/~sapidb/011000358700000917382010E
Hello,
The link does not work. Is this training still available?
Regards,
Romuald -
Upcoming SAP Best Practices Data Migration Training - Berlin
YOU ARE INVITED TO ATTEND HANDS-ON TRAINING
Berlin, Germany: October 06 – 08, 2010
Installation and Deployment of SAP Best Practices for Data Migration & SAP BusinessObjects Data Integrator
Install and learn how to use the latest SAP Best Practices for Data Migration package. This new package combines the familiar IDoc technology together with the SAP BusinessObjects (SBOP) Data Integrator to load your customer's legacy data to SAP ERP and SAP CRM (New!).
Agenda
At the end of this unique hands-on session, participants will depart with the SBOP Data Integrator and SAP Best Practices for Data Migration installed on their own laptops. The three-day training course will cover all aspects of the data migration package including:
1. Offering Overview – Introduction to the new SAP Best Practices for Data Migration package and data migration content designed for SAP BAiO / SAP ERP and SAP CRM
2. Data Integrator fundamentals – Architecture, source and target metadata definition. Process of creating batch jobs, validating, tracing, debugging, and data assessment.
3. Installation and configuration of the SBOP Data Integrator – Installation and deployment of the Data Integrator and content from SAP Best Practices. Configuration of your target SAP environment and deploying the Migration Services application.
4. Customer Master example – Demonstrations and hands-on exercises on migrating an object from a legacy source application through to the target SAP application.
Logistics & How to Register
October 06 – 08: Berlin, Germany
Wednesday 10AM – 5PM
Thursday 9AM – 5PM
Friday 9AM – 4PM
SAP Deutschland AG & Co. KG
Rosenthaler Strasse 30
D-10178 Berlin, Germany
Training room S5 (1st floor)
Partner Requirements: All participants must bring their own laptop to install SAP Business Objects Data Integrator on it. Please see attached laptop specifications and ensure your laptop meets these requirements.
Cost: Partner registration is free of charge
Who should attend: Partner team members responsible for customer data migration activities, or for delivery of implementation tools for SAP Business All-in-One solutions. Ideal candidates are:
• Data Migration consultants and IDoc experts involved in data migration and integration projects
• Functional experts that perform mapping activities for data migration
• ABAP developers who write load programs for data migration
Trainers
Oren Shatil – SAP Business All-in-One Development
Frank Densborn – SAP Business All-in-One Development
To register please follow the hyperlink below
http://intranet.sap.com/~sapidb/011000358700000940832010E
Hello,
The link does not work. Is this training still available?
Regards,
Romuald -
Need Data warehouse Training in India...
Hi,
Can somebody help me find out where I can learn data warehousing in detail, maybe along with some BI tools?
I'm using a Teradata data warehouse, but my source database is in Oracle and the front ends are BI tools like Business Objects, Cognos, Actuate, etc.
Thanks in anticipation,
DJ Panchal
Dear Dharmendra,
We are a cutting edge software training company, based in Ahmedabad, India, which brings real industry software training to you before anyone else. We were the pioneers in bringing the e-commerce/web technologies training in India and we are doing it again by bringing Business Intelligence (BI), Data Warehousing & related technologies, the hottest combination of software tools, which is growing in demand even in recession.
The Tools we will be giving training in are COGNOS, Business Objects, Actuate, Teradata and Oracle 9i.
Candidates from overseas will get a cheaper option in terms of fees.
Candidates within India will get the opportunity to learn real industry tools.
If you are interested, please contact us at [email protected] and we will then provide you with further details.
Thank You -
Data Staging Doubt??
Hi Gurus,
I am learning BI. I created a couple of scenarios in BI where staging is done starting from DataSource -> PSA -> Transformation -> InfoCube. The data load is done using a DTP, and the data goes directly from the DataSource to the InfoCube (not using any InfoSource in between). It's working fine for me without any hassles.
My question is: if it can be done this way, then what is the use of an InfoSource in sending data from DataSource -> InfoCube?
Can you please explain the use of the InfoSource in SAP BI scenarios, with the staging steps in BI 7.0?
Ritika
Hi,
In BI 7.0, infosource is used if there is more than one target to be updated from the same data source.
Datasource ->(Transfer rules) infosource ->(Update rules) datatargets
Infosource is not mandatory anymore. Scenarios for flexible infosource:
A flexible infosource is necessary in order to use currency or unit conversion from the source datasource → define the infosource as an intermediate structure.
You can use a flexible infosource as a uniform source for several targets; the infosource can be the target from different sources.
So we can use an InfoSource as an optional structure between a transformation and another transformation. This is necessary, for example, to consolidate several sources into several targets or to use InfoObject-based information.
Note: for direct InfoSources (For master data updates), there is no difference between old and new infosource, i.e. you can define a transformation as well as transfer rules.
Regards,
Priya.
Edited by: Priyadarshini K A on Aug 7, 2009 8:57 AM -
My department is new to SharePoint (by which I mean I am very ignorant about it), and at present we use it for setting up our training calendars. This is the process we currently use to do so:
We have an online learning course (let's call it Northern Book Cooking). This course runs for three days from 10:00-7:00, and it runs four times a year - say Jan 18, March 16, May 4, October 20. Each session is the same - if you can't sign up for January,
you can take the May course and so on.
In order to add these sessions to our calendar we first make a template, and then create a training event through Workflows. The training event asks for our course ID number, the last day one can register for it, its location and the start and end date.
The way we currently work is that we make the start and end day the same but change the times (so it starts on Jan 18 at 10:00 and ends Jan 18 at 7:00). Then we go over to the calendar, open up the first day, and click the recurring box. We then check the weekdays and add the end date, so that Northern Book Cooking runs from Jan 18-20, and each day goes from 10:00-7:00. Then we go back to training templates and start all over again for the next date, March 16.
What I want to know is if there is a way we can cut out that last part entirely. I want to be able to go to workflows, add a new training event, add the start date (Jan 18), add the end date (Jan 20), and add the times, and then have SharePoint do the rest, so that ultimately the course runs for three days, each day runs from 10-7, and I don't have to fiddle with the recurrence function 'manually.' Is there a way to do this? I know you can set SharePoint to recur a date monthly, but you don't appear to be able to select the months; it just recurs as a pattern date (Jan 18, Feb 18, March 18) until you tell it to stop.
Any help? Or am I dreaming of the impossible?
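For what it's worth, the desired expansion (start date, end date, and daily times in; one calendar entry per day out) is easy to express in code. A generic sketch, not SharePoint object-model code (the function name and shapes are invented for illustration):

```python
from datetime import date, datetime, time, timedelta

def expand_session(start_day, end_day, start_time, end_time):
    """Expand one multi-day course into per-day calendar entries,
    each running from start_time to end_time, as described above."""
    entries = []
    day = start_day
    while day <= end_day:
        entries.append((datetime.combine(day, start_time),
                        datetime.combine(day, end_time)))
        day += timedelta(days=1)
    return entries

# Northern Book Cooking, Jan 18-20, each day 10:00-19:00.
for begin, end in expand_session(date(2013, 1, 18), date(2013, 1, 20),
                                 time(10, 0), time(19, 0)):
    print(begin, "->", end)
```

Whether a SharePoint workflow can run this kind of expansion for you is the real question here; the logic itself is trivial once the platform lets you create one event per computed day.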
We are currently using SharePoint 2007, and will be updating to 2013 in the new year. If there is a solution that works only in a later version, let me know anyway.
Hi,
You can do this fairly easily using dbms_scheduler.evaluate_calendar_string, which supports a rich standard calendaring syntax. Here is a function which prints out dates, and an example. You can find more calendaring syntax examples here:
http://download.oracle.com/docs/cd/B28359_01/server.111/b28310/scheduse004.htm#i1023132
and full rules for the syntax here
http://download.oracle.com/docs/cd/B28359_01/appdev.111/b28419/d_sched.htm#CIHDJEEB
Hope this helps,
Ravi.
set serveroutput on
create or replace procedure print_schedule_dates
(
  schedule        in varchar2,
  start_date      in timestamp with time zone default null,
  number_of_dates in pls_integer default 10
)
is
  date_after        timestamp with time zone;
  next_date         timestamp with time zone;
  actual_start_date timestamp with time zone;
begin
  if start_date is null then
    actual_start_date := dbms_scheduler.stime();
    -- to make this work in 10.1, change the above line to
    -- actual_start_date := systimestamp;
  else
    actual_start_date := start_date;
  end if;
  date_after := actual_start_date - interval '3' second;
  for i in 1 .. number_of_dates
  loop
    dbms_scheduler.evaluate_calendar_string
      (schedule, actual_start_date, date_after, next_date);
    dbms_output.put_line(to_char(next_date,
      'DY DD-MON-YYYY (DDD-IW) HH24:MI:SS TZH:TZM TZR'));
    date_after := next_date;
  end loop;
end;
/
exec print_schedule_dates('freq=weekly;byday=fri', sysdate+4, 5 ) -
Legacy data upload for Training & Event Management
As per the requirement, the legacy/history data for training events is to be uploaded into the system. To start the process, data templates were prepared for uploading training objects like D (Business Event Group), L (Business Event Type) and E (Business Event), and these were uploaded. The relationships A003 between D & L and A020 between E & D were also maintained. As part of the training history, the attendees were also maintained (uploaded) using the relationship A025 between the E & P objects. However, when we go to PSV2, the tree structure does not reflect the names of the persons who attended the event.
Please suggest and advise.
Hi Nitin,
I have a query regarding the legacy data. I wanted to ask how you structured the historical BEG and BET.
Did you create different BEGs for the current and the legacy data?
But in that case there could be BETs common to both the current and legacy data, with different object IDs.
For example, under the current BEG and the historical BEG I may have a BET "Technical Programmes" with different IDs, say 60000000 and 60987655. This may create a problem when running reports.
Please tell me the strategy that you used for creating the catalog.
Regards
Yashika Kaka -
Regarding Project Preparation, Requirements Gathering, User Training
Hi,
Please send the documents for Project Preparation, Requirements Gathering, and User Training urgently to [email protected]
Regards,
Andrea
Hi Andreas,
The BW project guidelines can be as follows:
Stages in BW project
1 Project Preparation / Requirement Gathering
2 Business Blueprint
3 Realization
4 Final Preparation
5 GO Live & Support
Project Preparation / Requirement Gathering
Collect requirements through interviews with business teams / core users / information leaders.
Study & analyze the KPIs (key figures) of the business process.
Identify the measurement criteria (characteristics).
Understand the drill-down requirements, if any.
Understand the business process data flow, if any.
Identify the need for data staging layers in BW (i.e., the need for an ODS, if any).
Understand the system landscape.
Prepare the final requirements documents in the form of functional specifications containing:
Report owners,
Data flow,
KPIs,
Measurement criteria,
Report format along with drill-down requirements.
2 Business Blueprint
Check Business Content against the requirements.
Check for appropriate
InfoObjects - key figures & characteristics.
Check for InfoCubes / ODS.
Check for DataSources & identify fields in the source system.
Identify master data.
Document all the information in a file - follow standard templates.
Prepare the final solution.
Identify differences (gaps) between Business Content & the functional
specification. Propose new solutions/developments & changes if required at different levels such as InfoObjects, InfoCubes, DataSources etc. Document the gaps & the respective solutions proposed - follow standard templates.
Design & Documentation
Design the ERD & MDM diagrams for each cube & related objects.
Design the primary keys/data fields for intermediate storage in ODS.
Design the data flow charts right from the data source up to the cube.
Consider the performance parameters while designing the data models.
Prepare high-level / low-level design documents for each data model - follow standard templates.
Identify the roles & authorizations required and document them - follow standard templates.
Final review of the design with core BW users.
Sign off the BBP documents.
3 Realization
Check & apply the latest patches/packages in the BW & R/3 systems.
Activate/build & enhance the cubes/ODS as per the data model designs; maintain the version documents.
Identify & activate InfoObjects / master data InfoSources / attributes; prepare update rules.
Assign DataSources. Prepare transfer rules, MultiProviders, and InfoPackages.
Perform the unit testing for data loads, both for master data & transaction data.
Develop & test the end-user queries.
Design the process chains, schedule & test.
Create authorizations / roles, assign them to users, and test.
Apply necessary patches & Notes, if any.
Freeze & release the final objects to the quality systems.
Perform quality tests.
Redesign if required (document changes, maintain versions).
4 Final Preparation
Prepare the final checklist of objects to be released; identify the dependencies & sequence of release.
Perform Go-Live checks as recommended by SAP in the production system.
Keep patch levels up to date in the production system.
Test production scenarios in a pre-production system which is a replica of the production system.
Do not encourage changes at this stage.
Freeze the objects.
5 GO Live & Support
Keep patch levels up to date.
Release the objects to the production system.
Run the setups in the R/3 source system & initialize loads in BW.
Schedule batch jobs in the R/3 system (delta loads).
Schedule the process chains in BW.
Performance tuning - an ongoing activity.
Enhancements - if any.
You can get some detailed information in the following link.
http://sap.ittoolbox.com/documents/document.asp?i=3581
Try to go to ASAP implementation roadmap.
https://websmp103.sap-ag.de/~form/sapnet?_SHORTKEY=01100035870000420636&_SCENARIO=01100035870000000202
Check the links below, which give you a brief overview of the above steps.
https://websmp201.sap-ag.de/asap
http://www.geocities.com/santosh_karkhanis/ASAP/
https://service.sap.com/roadmaps
https://websmp104.sap-ag.de/bi
Hope these links help you.
Regards
CSM Reddy -
Follow up Business Event by specific qualification date.
Hi Expert,
In the follow-up of a business event, I want to transfer the qualification to attendees with the start and end date of the business event. For example, the business event is held on 07.05.2007 - 11.05.2007 and the qualification is given to all attendees with t-code PV15. By standard, the start and end date of the given qualification is 11.05.2007 - 31.12.9999. My intention is to give the dates as 07.05.2007 - 11.05.2007 instead. Is there any solution?
Here is the background of the question. We migrated the training history records from the legacy system into SAP and stored them as qualifications, and the start and end dates of a qualification are the duration (start/end date) of the training course.
Thanks in advance,
ECTSAKC
Hi,
Sorry, my English is not good. I'll try.
Our training process is like this. When any employee has attended a training course, he/she is given the qualification for that particular course. The qualification should be dated the same as the duration of the training course. For example, the Technical English course is held on 07.05.2007 - 11.05.2007. When the course is finished, the user does the follow-up activities using t-code PV15. By default, the start date of the given qualification is 11.05.2007 and the end date is 31.12.9999. But our intention is to give the assigned qualification the dates 07.05.2007 - 11.05.2007.
Is there any user exit available to do that?
Thanks.
ECTSAKC -
Can we use 0INFOPROV as a selection in Load from Data Stream
Hi,
We have implemented BW-SEM BPS and BCS (SEM-BW - 602 and BI 7 ) in our company.
We have two BPS cubes for Cost Center and Revenue Planning and we have Actuals Data staging cube, we use 0SEM_BCS_10 to load actuals.
We created a MultiProvider on BPS cubes and Staging cube as a Source Data Basis for BCS.
Issue:
When loading plan data or actuals data into the BCS (0BCS_C11) cube using the Load from Data Stream method, we have a performance issue. We automated the load process in a process chain. Sometimes it takes about 20 hrs for the plan data load alone, for 3 group currencies, and then the elimination tasks.
What I noticed is that, for example, when loading plan data, the system is also reading the Actuals cube, which is not required; there is no selection available in the Mapping or Selection tab where I can restrict the data load to a particular cube.
I tried to add 0INFOPROV into the data basis, but then it doesn't show up as a selection option in the data collection tasks.
Is there a way to restrict the data load into BCS using this load option, restricting the cube I read data from?
I know that there is a filter BAdI available, but I am not sure how it works.
Thanks !!
Naveen Rao Kattela
Thanks Eugene,
We do have other characteristics like Value Type (10 = Actual and 20 = Plan) and Version (100 = USD Actual and 200 = USD Plan), but when I load data into BCS using the Load from Data Stream method, the request goes to all the underlying cubes, which in my case are the planning cubes and the Actuals cube. I don't want the request to go to the Actuals cube when I am running only the plan load; I think it is causing a performance issue.
For this reason I am wondering if I can use 0INFOPROV, as we do in BEx queries, to filter the InfoProvider so that the data load performance improves.
I was able to bring 0INFOPROV into the data basis by adding 0INFOPROV to the characteristics folder used by the data basis.
I can see this InfoObject in the Data Stream Fields tab. I checked it to use it in the selection and regenerated the data basis.
I was expecting that this field would now be available for selection in the data collection method, but it is not.
So if it is confirmed that there is no way to use 0INFOPROV as a selection, I will suggest to my client a redesign of the data basis itself.
Thanks,
Naveen Rao Kattela