Creation of a (cost-effective) Operational Data Store

I already have a DB schema with staging and ODS tables and would like to (almost) duplicate this environment. The duplication would involve creating a separate schema containing all the persistent data tables. Central to the duplication idea is economy versus efficiency: saving space, time, and effort while still providing a quality environment!
The current ODS extracts data, transfers it into staging tables, validates it, then loads it into historical tables (persistent data). It would not make much sense to duplicate this process entirely, so I was wondering what the best way forward would be.
There are two immediate questions that I see here:
The creation and loading of the historical tables. The historical load of this data took many weeks. Backing up and restoring the data may provide a simple workaround, but how practical is that approach?
One idea would be to duplicate the historical tables by creating a second (historical) schema and loading both historical (persistent) environments from the one staging area after the daily deltas, thereby duplicating hundreds of GBs of data (not a genius idea, mind you).
Any ideas?

CDC is able to capture DML, so you are all set. But since you have that vague "some wiggle room for data in flight" requirement, one technique I can think of, which I implemented at a financial institution, was capturing a hash of the record before and after (MD5) and then comparing the hashes. I stored the hash in an audit table. In my opinion it was a waste of money, as even after several years the process never flagged a single difference.
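For illustration, the before/after hashing idea looks roughly like this (a minimal sketch in Python rather than the original database-side implementation; the column list, separator, and record shape are assumptions):

```python
import hashlib

def record_hash(record: dict, columns: list) -> str:
    """Build a deterministic MD5 hash over the listed columns.

    A fixed column order and an explicit separator keep the hash
    stable across runs; NULLs are folded to an empty string.
    """
    payload = "|".join("" if record.get(c) is None else str(record.get(c))
                       for c in columns)
    return hashlib.md5(payload.encode("utf-8")).hexdigest()

# Compare the hash captured before the load with the one captured after.
columns = ["customer_id", "balance", "status"]
before = {"customer_id": 42, "balance": "100.00", "status": "A"}
after  = {"customer_id": 42, "balance": "100.00", "status": "A"}

if record_hash(before, columns) != record_hash(after, columns):
    print("difference flagged -> write to audit table")
else:
    print("no difference")
```

Storing the hexdigest in an audit table alongside the record key and load timestamp lets you flag any row whose content changed between capture and load.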
Arthur

Similar Messages

  • Can you create aggregates in an ODS (Operational Data Store)?

    Can you create aggregates in an ODS (Operational Data Store)?
    Thanks in advance

    Hi,
    Refer to these links:
    Re: Difference between InfoCube and ODS Object
    Re: aggregates
    With regards,
    Anil Kumar Sharma .P

  • In ODS, what is the meaning of "operational"? (operational data store - ODS)

    In ODS, what is the meaning of "operational"? (operational data store - ODS)

    Hello Satish
    I guess you are quite clear about data and info...
    Data: anything and everything is data
    Info: if data is structured in a way that gives meaningful information, then it is info
    We use the ODS as a data warehouse layer which contains meaningful data which, when loaded into a cube (in a structured way - the extended star schema in our case), provides meaningful information... so basically it contains the data on which the operations of the DW are going to be performed (it may be analyzing the data, making decisions based on the data, ...)
    That is why it is called an operational data store instead of just a data store...
    Hope you have got your answer...
    Thanks
    Tripple k

  • How to find the most cost-effective forwarding agent at the time of shipment creation

    Dear Friends,
    I have a requirement to determine the most cost-effective transporter (forwarding agent) in the shipment document. How can this be done using the Transportation functionality of the ERP?
    The scenario is that I want to transport my goods from L to M. Say I have a route XX for that in the system, which gets automatically determined in the shipment document.
    Now on route XX, I have 5 operative transporters, say A, B, C, D, E.
    Transporter A provides services km-wise, kg-wise, at a fixed price, as well as truck-wise
    (say, if you opt for km-wise, charges are 15 Rs/km, 10 Rs/kg, or a fixed price of 15,000 for a full truck from A to B)
    Transporter B is similar to Transporter A, with different charges
    (say, if you opt for km-wise, charges are 18 Rs/km, 7 Rs/kg, or a fixed price of 14,500 for a full truck from A to B)
    Similarly, the other transporters also have their own charges.
    Now at the time of shipment creation, how can I have some standard method so that the system suggests the most cost-effective transporter for the delivery, i.e. the shipment I want to make, or some standard way to get the list of all available transporters on a given route at the time of shipment creation?
    Please suggest.
    Regards,
    Dipti.

    Hi Sandeepan,
    Thanks for your reply.
    As per my requirement, there might be say 100 routes in the system, and each route might have say 10 different transporters available. Now they want to know: 1) the list of transporters available on the given route (at run time, at the time of shipment creation) to be put into the forwarding agent field, and 2) of the available (say 10) transporters, who will be the best/cheapest.
    As per your suggestion, the functionality helps after we put in the forwarding agent and then estimate the associated cost. But in this case, since there are so many routes (with so many associated transporters), they will not know whom to put in the field.
    Hence I want to know if there is some standard SAP functionality which might help us i) identify the list of transporters on a particular route or per given criteria at run time, and ii) find the best among those available, OR have SAP directly suggest the single best transporter available on a given route.
    Please help.
    Regards,
    Dipti.
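    There is no single standard transaction I know of that ranks transporters this way, but the comparison itself is easy to sketch. The following is a hypothetical illustration in Python (not SAP functionality; the rate figures come from the example above, and the function names are made up): cost every applicable pricing model per transporter, then take the minimum.

```python
def shipment_cost(rates: dict, distance_km: float, weight_kg: float) -> float:
    """Return the cheapest applicable price for one transporter.

    `rates` may quote per-km, per-kg and fixed full-truck prices;
    the planner is assumed free to pick whichever model is cheapest.
    """
    options = []
    if "per_km" in rates:
        options.append(rates["per_km"] * distance_km)
    if "per_kg" in rates:
        options.append(rates["per_kg"] * weight_kg)
    if "fixed_truck" in rates:
        options.append(rates["fixed_truck"])
    return min(options)

def cheapest_transporter(transporters: dict, distance_km: float, weight_kg: float):
    """Rank all transporters on a route; return (name, cost) of the best."""
    costs = {name: shipment_cost(r, distance_km, weight_kg)
             for name, r in transporters.items()}
    return min(costs.items(), key=lambda kv: kv[1])

# Rates from the example: A = 15 Rs/km, 10 Rs/kg, 15000 fixed;
# B = 18 Rs/km, 7 Rs/kg, 14500 fixed.
route_xx = {
    "A": {"per_km": 15, "per_kg": 10, "fixed_truck": 15000},
    "B": {"per_km": 18, "per_kg": 7, "fixed_truck": 14500},
}
print(cheapest_transporter(route_xx, distance_km=400, weight_kg=1200))
# -> ('A', 6000)
```

    In SAP terms, the rate tables would come from the freight condition records per forwarding agent; the point of the sketch is only that the "best transporter" decision reduces to evaluating each agent's condition types for the planned distance/weight and sorting by cost.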

  • Cost-Roll take follow-up material upon discontinuation effective-out date ?

    Dear Expert,
    If we use discontinuation/follow-up material, during Cost-Roll (CK11N), SAP will take default cost from discontinued material even if follow-up material is effective. 
    <b>How to make SAP take cost of follow-up material after/on effective-out date ?</b>
    <b>Example :</b>
    Parent "A" contains component "B" qty 3, to be follow-up with component "C" upon discontinuation of "B" on effective-out dated 1st Nov'07, and like-wise to take std cost of "B' before 1st Nov'07 cost-roll.
    Thus, when I roll cost in December, I hope SAP would calculate for std cost of A to include std cost of "C" (for cost-roll after 1st Nov'07).
    <b>Setting :</b>
    <u>Material Master of "B" MRP4 view :</u>
    - Discontinuation Indicator = 1
    - Effective-out date = 1st Nov'07
    - Follow-up Material = "C"
    <u>BOM of Parent "A" :</u>
    - Component "B" indicate discontinuation group "A1"
    - Add component "C" and indicate follow-up group "A1".
    <b>Thanks for your guidance !</b>

    Hello,
    We are having the same issue: is a costing BOM the only way to handle this? Our controlling department fears such BOMs would not be maintained as accurately as the production ones (and they are right). We were also thinking of setting the phased-out material (B in your example) as cost-irrelevant when the new material is added to the BOM, and then costing the parent (A in your example) directly with the new component. I would appreciate hearing how others have dealt with this issue, if anybody has examples.
    Thanks a lot,
    Olivier

  • Co code not seen for cost center master data creation in OKEON

    Hi:
    In the Org. tab of cost center master data creation, while configuring the cost center hierarchy in OKEON, I cannot see the organization tab. I have checked in OKKP that the controlling area is assigned to the company code, but the issue still persists.
    Br

    Hi,
    You mean the tab is not displayed? Strange; it seems there should be a screen layout controlling the UI, or a transaction variant has been recorded. I tried on my side, and the tab is there.
    @Murali, you are right, but in that tab not only the company code can be configured, but also the business area, functional area, etc.
    Best regards, Lawrence

  • Cannot attach data store shared-memory segment using JDBC (TT0837)

    I'm currently evaluating TimesTen during which I've encountered some problems.
    All of a sudden my small Java app fails to connect to the TT data source.
    Though I can still connect to the data source using ttisql.
    Everything worked without problems until I started poking around in the ODBC administrator (Windows 2K).
    I wanted to increase permanent data size so I changed some of the parameters.
    After that my Java app fails to connect with the following message:
    DriverManager.getConnection("jdbc:timesten:direct:dsn=rundata_tt60;OverWrite=0;threadsafe=1;durablecommits=0")
    trying driver[className=com.timesten.jdbc.TimesTenDriver,com.timesten.jdbc.TimesTenDriver@addbf1]
    SQLException: SQLState(08001) vendor code(837)
    java.sql.SQLException: [TimesTen][TimesTen 6.0.4 ODBC Driver][TimesTen]TT0837: Cannot attach data store shared-memory segment, error 8 -- file "db.c", lineno 8846, procedure "sbDbConnect()"
    The TT manual hasn't really provided any good explanation of what the error code means.
    Obviously I've already tried restoring the original ODBC parameters, without any luck.
    Ideas, anyone?
    /Peter

    Peter,
    Not sure if you have resolved this issue or not. In any case, here is some information to look into.
    - On Windows 32-bit, the allocation of the shared data segment doesn't work the same way as on Unix and Linux. As a result, the maximum TimesTen database size one can allocate is much smaller on the Windows platform than on other platforms.
    - Windows error 8 means ERROR_NOT_ENOUGH_MEMORY: not enough storage is available to process this command.
    - TimesTen TT0837 says the system was unable to attach a shared memory segment during a data store creation or data store connection operation.
    - What was the largest successful perm-size and temp-size you used when allocating the TimesTen database?
    * One explanation for why you were able to connect using ttIsql is that it doesn't use many DLLs, whereas your Java application typically loads a lot more DLLs, leaving less contiguous address space for the shared segment.
    * As a troubleshooting step, you can try reducing your TempSize to a very small value and see if you can connect to the data store. Eventually, you may need to reduce your PermSize as well to get Windows to fit the shared data segment into the process address space.
    By the way the TimesTen documentation has been modified to document this error as follows:
    Unable to attach to a shared memory segment during a data store creation or data store connection operation.
    You will receive this error if a process cannot attach to the shared memory segment for the data store.
    On UNIX or Linux systems, the shmat call can fail due to one of:
    - The application does not have access to the shared memory segment. In this case the system error code is EACCESS.
    - The system cannot allocate memory to keep track of the allocation, or there is not enough data space to fit the segment. In this case the system error code is ENOMEM.
    - The attach exceeds the system limit on the number of shared memory segments for the process. In this case the system error code is EMFILE.
    It is possible that some UNIX or Linux systems will have additional possible causes for the error. The shmat man page lists the possibilities.
    On Windows systems, the error could occur because of one of these reasons:
    - Access denied
    - The system has no handles available.
    - The segment cannot be fit into the data section
    Hope this helps.
    -scheung

  • The most cost-effective long-term solution, Upgrade or Buy Newer Computer?

    I am looking for the most cost-effective long-term solution for me as a self-employed interior designer. I need help making a decision: is it reasonable to upgrade my iBook G4 (RAM, operating system, a bigger hard drive, a new rechargeable battery, a new portable power adapter), or do I need to abandon it and buy a newer Apple computer (0-2 years old)?
    ABOUT MY MAC:
    Hardware Overview:
    Machine Model: iBook G4
    CPU Type: PowerPC G4 (1.1)
    Number Of CPUs: 1
    CPU Speed: 1.2 GHz
    L2 Cache (per CPU): 512 KB
    Memory: 256 MB
    Bus Speed: 133 MHz
    Boot ROM Version: 4.8.7f1
    I will appreciate your suggestions very much! Thanks a lot in advance!

    Hi Cali and Welcome to Apple Discussions,
    I think that decision would depend on how you plan to use this laptop. In addition it would depend on your client base and the image you need to project.
    I used to be a commercial producer. If I was still doing that today I'd need to project a successful up to date image. My clients at the large agencies were/are attracted to the latest technologies. I would have the hippest iPhone apps and probably a 15" Unibody MacBook Pro.
    I have a pro photographer client who wanted to buy an iMac a few years back but I insisted she buy a Mac Pro with at least two of the largest Cinema Monitors she could afford.
    She has thanked me multiple times since she says her clients are so impressed with her images when she displays them on those large monitors.
    So it would depend on how you're using the laptop and your client base.
    The first thing I see with your iBook is that your RAM is pretty minimal. If you update that would be the first thing to do.
    This is actually the iBook that I use, and for me it's very cost-effective. They're both 1.2 GHz machines: I use a 14" around the house and a 12" when we're traveling. They were very cost-effective for me as I purchased them broken for under $125 total, and I use salvaged drives and RAM. It works for me since my business is repair, and clients are impressed by how well my 5-year-old iBook works.
    Richard

  • Error while activating Data Store Object

    Hi Guru's,
    When I try to activate a data store object i get the error message :
         The creation of the export DataSource failed
         No authorization to logon as trusted system (Trusted RC=2).
         No authorization to logon as trusted system (Trusted RC=2).
         Error when creating the export DataSource and dependent objects. Program ID 4SYPYCOPQ94IXEGA3739L803Z retrieved for DataStore object ZODS_PRA

    Hi,
    You are facing an issue with your source system 'myself'; check and repair it. Also check whether the communication user (normally ALEREMOTE) has all the permissions needed.
    kind regards
    Siggi

  • Auto creation of Shipment cost document

    Hi Folks,
    Is anyone aware of how to create a shipment cost document using a function module?
    I need to create shipment cost document as well as update the condition values. But all this needs to be done via a BAPI or something in the background.
    Thanks,
    Mihir

    Ask an ABAPer and check out the program "RV56TRSL" for your requirement. Through this program, you can schedule the creation of shipment cost documents.
    If you're talking about automatic PO creation, then you need to do the following:
    1. Assignment of the relevant purchasing org & group is done at SPRO -> LE -> Transportation -> Shipment cost -> Settlement -> Assign purchasing data
    2. To create the PO automatically, select the shipment cost type & item category for automatic PO (T_56) and select the default PO type for the transaction.
    Path: IMG -> Material Mgmt. -> Purchasing -> Define Default Values for Document Type.
    Path: IMG -> Logistics Execution -> Transportation -> Shipment Cost -> Shipment Cost Document -> Shipment Cost Type & Item Category (T_56).
    3. Remember to check EKKN, which is the table for account assignment in the PO.
    Good luck
    Good luck

  • How to Set a Variable with Data from a Source Data Store

    Hello ODI Experts,
    I have created a physical & logical schema and a source data store to pick up data from a database table.
    On the other hand, I have a few variables that I will pass in a web service call (the ODIInvokeWebService tool).
    Would you please guide me on how I can set variables from my source data store?
    Thanks & Regards,
    Ahsan

    Hello Bos/Damodhar/ODI Experts,
    Doesn't that give me a less optimized approach, picking one column per query (per variable)?
    Let's say I have to pick 35 columns from a table and put those in 35 variables... it would mean running 35 queries to fetch one record from the database table.
    Doesn't that seem less performance-effective (less optimized)? A little scary... is there anything better I can do to make it more optimized?
    Another question: if multiple new values have come into the DB table, since I am using a Refresh Variable, would the variable hold multiple values?
    Thanks for all your help,
    Ahsan
    Edited by: Ahsan Asghar on 21-Jun-2011 07:46
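    On the optimization point: the usual pattern outside per-variable refreshes is to fetch the whole record once and fan the columns out into variables, instead of issuing one query per column. A minimal sketch of that idea (plain Python with an in-memory SQLite stand-in for the source table; the table and column names are invented, and this is not ODI's own mechanism):

```python
import sqlite3

# Stand-in source table with a few of the 35 columns.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src (id INTEGER, name TEXT, amount REAL)")
conn.execute("INSERT INTO src VALUES (1, 'alpha', 9.5)")

# One query fetches every needed column for the record...
cur = conn.execute("SELECT id, name, amount FROM src WHERE id = 1")
row = cur.fetchone()

# ...and the columns fan out into named variables in memory,
# instead of 35 separate single-column queries.
columns = [d[0] for d in cur.description]
variables = dict(zip(columns, row))
print(variables)
```

    Whether ODI variables can be populated that way in one pass is a separate question, but the single-query fetch is the performance-friendly shape to aim for.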

  • ttRestore: Restoring a Data Store with a Large PermSize to a Smaller PermSize

    I have a backup of a data store created by ttBackup. I'd like to restore that data store into a data store with a smaller PermSize. The original data should fit in the new data store. However, I'm encountering errors doing this.
    I've done the following:
    - Ran ttBackup on the source data store.
    - The source data store has a PermSize of 16 GB, but its backup file is about 4.5 GB.
    - On the target machine, I created an entry in the odbc.ini for the new data store, but with a PermSize=6144
    - On the target machine, I then run ttIsql target_dsn_name
    - This works okay and ipcs shows a shared memory segment with a size that correlates to the PermSize=6144.
    - Then that data store is ttDestroy'ed to get it out of the way.
    - Next I ran: ttRestore -fname source_dsn_name -connstr "DSN=target_dsn_name;Preallocate=1;PermSize=6144;TempSize=120" -dir . -noconn
    - This succeeds, but only because "-noconn" was specified.
    - When I 1st try to connect to the datastore by running: ttIsql "DSN=target_dsn_name;PermSize=6144;TempSize=120"
    - It fails with the following error:
    836: Cannot create data store shared-memory segment, error 22
    703: Subdaemon connect to data store failed with error TT836
    - The tterror.log contains:
    19:59:05.54 Err : : 3810: TT14000: TimesTen daemon internal error: Error 22 creating shared segment, KEY 0x0401a8b2
    19:59:05.54 Err : : 3810: -- OS reports invalid shared segment size
    19:59:05.54 Err : : 3810: -- Confirm that SHMMAX kernel parameter is set > datastore size
    19:59:05.54 Err : : 3820: subd: Error identified in [sub.c: line 3188]
    19:59:05.54 Err : : 3820: subd: (Error 836): TT0836: Cannot create data store shared-memory segment, error 22 -- file "db.c", lineno 9342, procedure "sbDbConnect"
    19:59:05.54 Err : : 3820: file "db.c", lineno 9342, procedure "sbDbConnect"
    19:59:05.54 Warn: : 3820: subd: connect trouble, rc 1, reason 836
    19:59:05.54 Err : : 3820: Err 836: TT0836: Cannot create data store shared-memory segment, error 22 -- file "db.c", lineno 9342, procedure "sbDbConnect"
    19:59:05.54 Err : : 3810: TT14000: TimesTen daemon internal error: Could not send 'manage' request to subdaemon rc 400 err1 703 err2 836
    19:59:06.45 Warn: : 3810: 3820 ------------------: subdaemon process exited
    Note that on the target machine total shared memory is configured for only 10 GB, which is smaller than the size of the original data store.

    Hi Brian,
    ttRestore cannot be used for this purpose. The restored data store will always have the PermSize that was in effect when it was backed up. You cannot shrink an existing data store. If you need to move the tables and data to a store with a smaller PermSize (assuming they will fit, of course), then you need to use ttMigrate instead.
    Chris

  • 703: Subdaemon connect to data store failed with error TT9999

    All,
    I'm getting the following error whilst trying to connect to a TimesTen DB:
    connect "DSN=my_cachedb";
    703: Subdaemon connect to data store failed with error TT9999
    In the tterrors.log:
    16:39:24.71 Warn: : 2568: 3596 ------------------: subdaemon process exited
    16:39:24.71 Warn: : 2568: 3596 exited while connected to data store '/u01/ttdata/datastores/my_cachedb' shm 33554529 count=1
    16:39:24.71 Warn: : 2568: daRecovery: subdaemon 3596, managing data store, failed: invalidate (failcode=202)
    16:39:24.71 Warn: : 2568: Invalidating the data store (failcode 202, recovery for 3596)
    16:39:24.72 Err : : 2568: TT14000: TimesTen daemon internal error: Could not send 'manage' request to subdaemon rc -2 err1 703 err2 9999
    16:39:24.72 Warn: : 2568: 3619 Subdaemon reports creation failure
    16:39:24.72 Err : : 2568: TT14000: TimesTen daemon internal error: Deleting 3619/0x1558650/'/u01/ttdata/datastores/my_cachedb' - from association table - not found
    16:39:24.72 Err : : 2568: TT14004: TimesTen daemon creation failed: Could not del from dbByPid internal table
    16:39:24.81 Warn: : 2568: child process 3596 terminated with signal 11
    16:39:25.09 Err : : 2568: TT14000: TimesTen daemon internal error: daRecovery for 3619: No such data store '/u01/ttdata/datastores/my_cachedb'
    I've checked and the datastore does exist and is owned by the timesten UNIX user.
    ttversion:
    TimesTen Release 11.2.2.2.0 (64 bit Linux/x86_64) (tt1122:53396) 2011-12-23T09:26:28Z
    Instance admin: timesten
    Instance home directory: /home/timesten/TimesTen/tt1122
    Group owner: timesten
    Daemon home directory: /home/timesten/TimesTen/tt1122/info
    PL/SQL enabled.
    Datastore definition from sys.odbc.ini:
    [my_cachedb]
    Driver=/home/timesten/TimesTen/tt1122/lib/libtten.so
    DataStore=/u01/ttdata/datastores/my_cachedb
    LogDir=/u01/ttdata/logs
    PermSize=40
    TempSize=32
    DatabaseCharacterSet=AL32UTF8
    OracleNetServiceName=testdb
    Kernel parameters from sysctl -a:
    kernel.shmmax = 68719476736
    kernel.shmall = 4294967296
    Memory / SWAP:
    MemTotal: 2050784 kB
    SWAP: /dev/mapper/VolGroup00-LogVol01 partition 4095992
    I'm new to TimesTen and I'm planning on evaluating it to see if it could solve an issue we're having. Any suggestions would be much appreciated.
    Thanks,
    Ian.

    Hi Ian,
    Can you please answer the following / provide the following information:
    1. What are your kernel parameters relating to semaphores set to? Is anything else on the machine using significant numbers of semaphores?
    2. Please provide the output of the following shell commands:
    ls -ld /u01
    ls -ld /u01/ttdata
    ls -ld /u01/ttdata/datastores
    ls -ld /u01/ttdata/logs
    3. Please provide an excerpt of the detailed message log (ttmesg.log) between around 16:38 and 16:40 (i.e. from a little while before the problem until after the problem).
    Thanks,
    Chris

  • Change Cost center valid date

    Hi, Experts,
    Can we change the cost center valid-from date from 2007.10.01 to 2007.08.01? If it can be changed, please tell me how. Thanks.
    best regards
    Park Han

    Well, it's a little late to reply to the original question, but I found this thread while trying to figure out how to do the same thing myself and thought I would update it with the answer in case anyone else stumbles across this thread while searching as I did.
    You can actually make a change in the valid from date by using a KS01 Create Cost Center transaction and referencing the original cost center to create the additional period you want to add. 
    In this case in KS01 you would fill out the fields in this manner:
    Cost Center: Existing Cost Center to be changed, let's say 1000
    Valid From 2007.08.01
    Valid to: 2007.09.30
    Reference Cost Center: 1000
    Controlling Area: Whatever the controlling area is
    In effect, rather than changing the cost center, you are creating the extra time period from the new from date to the original from date and referencing it to the cost center.
    The end result for us was that KS03 shows the cost center with the new from date and the old to date, which was the change we wanted.
    Not quite as obvious as a change, but it worked for us.

  • SCD Type 2 - effective/expiration dates

    OWB 11g 11.2.0.2
    I'm coming in cold to an environment where there's a business need to have a pair of start/end dates in a type 2 DIM, very similar to the way the effective/expiration dates are maintained by OWB. The difference is that the end date in the closed record (becoming history) and the start date in the active record are not (the default) SYSDATE, but come from a source table/column in the mapping. There is no gap between these date/times.
    None of my attempts seem to be working as intended. I know the concepts of type 2, but am not well versed in working with them. I guess the first question is: is it even possible to manipulate a column in a record while the dimension controls the creation of the history record? It seems to me we are trying to mix manual and automated approaches, and it won't work. But maybe there is a trick to this that I haven't yet learned.
    Any insights would be greatly appreciated.
    ~Peter

    Hi Peter,
    "The difference is that the end date in the closed record (becoming history) and the open date in the active record are not (the default) SYSDATE, but come from a source table/column in the mapping." - Is your source table already in SCD2 format (i.e. may it contain multiple rows for a single entity)?
    "I guess the first question is, is it even possible to manipulate a column in a record as the dim controls the creation of a history record?" - It is possible to change the behaviour of populating the Effective Date column for an SCD2 dimension: you can map the Effective Date column from a mapping parameter, source table columns, etc.
    But the Expiration Date column is populated implicitly (for the closed record) from the value mapped to the Effective Date column.
    Regards,
    oleg
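    The implicit-expiration behaviour described above can be illustrated outside OWB. In this hypothetical Python sketch (not OWB-generated code; the dimension layout is invented), the incoming row carries its own source-supplied effective timestamp, and closing the old version copies that timestamp into the old row's expiration date, which is why the two versions abut with no gap:

```python
from datetime import datetime

def apply_scd2_change(dim_rows, key, new_attrs, src_effective: datetime):
    """Close the current version of `key` and open a new one.

    The expiration date of the closed row is taken from the source-
    supplied effective date of the new row (not SYSDATE), so the two
    versions abut with no gap between them.
    """
    for row in dim_rows:
        if row["key"] == key and row["expiration"] is None:
            row["expiration"] = src_effective      # close current version
    dim_rows.append({"key": key, **new_attrs,
                     "effective": src_effective, "expiration": None})
    return dim_rows

dim = [{"key": 7, "city": "Lyon",
        "effective": datetime(2010, 1, 1), "expiration": None}]
apply_scd2_change(dim, 7, {"city": "Nice"}, datetime(2011, 6, 15))
for row in dim:
    print(row["city"], row["effective"], row["expiration"])
```

    This mirrors Oleg's point: you only ever map the effective date; the expiration of the closed record falls out of it.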
