Interfaces- full and delta
Hi everybody,
I have a question about interfaces, full files and delta files. Is there a way to find out (either by looking at the program, or at the job related to that program) whether the output of a particular interface is a full file or a changes-only file? Or is there any other method to find this out in an SAP R/3 environment? As far as I know, in BW we can find this out using the InfoPackages. Am I right?
Thank You,
A quick response would be appreciated,
Kaya.
Please search the forums. This topic has already been discussed a lot of times.
Similar Messages
-
Full and delta Data loads in SAP BI 7.0
Hi,
Kindly explain me the difference between full and delta data loads in cube and DSO.
When do we select change log or active table(with or without archive) with full/delta in DTP for DSO.
Please explain the differences in data load for above different combinations.
With thanks and regards,
Mukul
Please search the forums. This topic has already been discussed a lot of times.
-
Error in Full and Delta CPACache refresh.
Hi Experts,
In our production environent we have a SOAP -> XI - > PROXY scenario [Synchronous].
The Consumer of the WebService has reported an error:
500_Internal_Server_Error = SOAP:Server_Server_Error_XIAdapter_ADAPTER.JAVA_EXC
When I checked, there were no messages in SXMB_MONI, but when I checked the Cache Refresh History I found that
there were errors in both the Full and Delta cache refresh. However, a restart of the XI (Production) server resolved the issue.
The error is as shown below:
<?xml version="1.0" encoding="UTF-8" ?>
- <CacheRefreshError>
<EngineType>AE</EngineType>
<EngineName>af.pxi.su0956</EngineName>
<RefreshMode>F</RefreshMode>
- <GlobalError>
<Message>Couldn't parse Configuration Data cache update XML string from Directory.</Message>
org.xml.sax.SAXException: JavaErrors Tag found in cache update XML.
at com.sap.aii.af.service.cpa.impl.cache.directory.DirectoryDataSAXHandler.startElement(DirectoryDataSAXHandler.java:157)
at com.sap.engine.lib.xml.parser.handlers.SAXDocHandler.startElementEnd(SAXDocHandler.java(Compiled Code))
at com.sap.engine.lib.xml.parser.XMLParser.scanElement(XMLParser.java(Compiled Code))
at com.sap.engine.lib.xml.parser.XMLParser.scanContent(XMLParser.java(Compiled Code))
at com.sap.engine.lib.xml.parser.XMLParser.scanElement(XMLParser.java(Compiled Code))
at com.sap.engine.lib.xml.parser.XMLParser.scanDocument(XMLParser.java(Compiled Code))
at com.sap.engine.lib.xml.parser.XMLParser.parse0(XMLParser.java(Compiled Code))
at com.sap.engine.lib.xml.parser.AbstractXMLParser.parseAndCatchException(AbstractXMLParser.java(Compiled Code))
at com.sap.engine.lib.xml.parser.AbstractXMLParser.parse(AbstractXMLParser.java(Inlined Compiled Code))
at com.sap.engine.lib.xml.parser.AbstractXMLParser.parse(AbstractXMLParser.java(Compiled Code))
at com.sap.engine.lib.xml.parser.Parser.parse_DTDValidation(Parser.java(Inlined Compiled Code))
at com.sap.engine.lib.xml.parser.Parser.parse(Parser.java(Compiled Code))
at com.sap.engine.lib.xml.parser.SAXParser.parse(SAXParser.java(Compiled Code))
at javax.xml.parsers.SAXParser.parse(Unknown Source)
at javax.xml.parsers.SAXParser.parse(Unknown Source)
at com.sap.aii.af.service.cpa.impl.cache.directory.DirectoryDataParser.updateCentralCache(DirectoryDataParser.java:56)
at com.sap.aii.af.service.cpa.impl.cache.CacheManager.updateCacheWithDirectoryData(CacheManager.java:871)
at com.sap.aii.af.service.cpa.impl.cache.CacheManager.performCacheUpdate(CacheManager.java:640)
at com.sap.aii.af.service.cpa.impl.servlet.CacheRefresh.process(CacheRefresh.java:104)
at com.sap.aii.af.service.cpa.impl.servlet.CacheRefresh.doGet(CacheRefresh.java:53)
at javax.servlet.http.HttpServlet.service(HttpServlet.java(Compiled Code))
at javax.servlet.http.HttpServlet.service(HttpServlet.java(Compiled Code))
at com.sap.engine.services.servlets_jsp.server.HttpHandlerImpl.runServlet(HttpHandlerImpl.java(Compiled Code))
at com.sap.engine.services.servlets_jsp.server.HttpHandlerImpl.handleRequest(HttpHandlerImpl.java(Compiled Code))
at com.sap.engine.services.httpserver.server.RequestAnalizer.startServlet(RequestAnalizer.java(Inlined Compiled Code))
at com.sap.engine.services.httpserver.server.RequestAnalizer.startServlet(RequestAnalizer.java(Compiled Code))
at com.sap.engine.services.httpserver.server.RequestAnalizer.invokeWebContainer(RequestAnalizer.java(Compiled Code))
at com.sap.engine.services.httpserver.server.RequestAnalizer.handle(RequestAnalizer.java(Compiled Code))
at com.sap.engine.services.httpserver.server.Client.handle(Client.java(Inlined Compiled Code))
at com.sap.engine.services.httpserver.server.Processor.request(Processor.java(Compiled Code))
at com.sap.engine.core.service630.context.cluster.session.ApplicationSessionMessageListener.process(ApplicationSessionMessageListener.java(Compiled Code))
at com.sap.engine.core.cluster.impl6.session.MessageRunner.run(MessageRunner.java(Compiled Code))
at com.sap.engine.core.thread.impl3.ActionObject.run(ActionObject.java(Compiled Code))
at java.security.AccessController.doPrivileged1(Native Method)
at java.security.AccessController.doPrivileged(AccessController.java(Compiled Code))
at com.sap.engine.core.thread.impl3.SingleThread.execute(SingleThread.java(Compiled Code))
at com.sap.engine.core.thread.impl3.SingleThread.run(SingleThread.java(Compiled Code))
Can anyone tell me what the cause of this error is, and how a restart of the XI server resolved the issue?
Thanks,
Guys,
I have deployed the below components and after this repository could not be started which gives the below error.
1000.7.11.10.18.20130529165100
SAP AG
sap.com
SAP_XIAF
1000.7.11.10.11.20130529165100
SAP AG
sap.com
SAP_XIESR
1000.7.11.10.15.20130529165100
SAP AG
sap.com
SAP_XITOOL
We have the same issue. Still, the restart did not solve this issue.
Can anybody help? We are not able to start the XI Repository.
Our PI components are as below
com.sap.engine.services.ejb3.runtime.impl.refmatcher.EJBResolvingException: Cannot start application sap.com/com.sap.xi.repository; nested exception is: java.rmi.RemoteException: [ERROR CODE DPL.DS.6125] Error occurred while starting application locally and wait.; nested exception is:
com.sap.engine.services.deploy.container.DeploymentException: Cannot activate endpoint for message-driven bean sap.com/com.sap.xi.repository*xml|com.sap.xpi.ibrep.server.jar*xml|CacheRefreshMDB
at com.sap.engine.services.ejb3.runtime.impl.DefaultContainerRepository.startApp(DefaultContainerRepository.java:315)
at com.sap.engine.services.ejb3.runtime.impl.DefaultContainerRepository.getEnterpriseBeanContainer(DefaultContainerRepository.java:106)
at com.sap.engine.services.ejb3.runtime.impl.DefaultRemoteObjectFactory.resolveReference(DefaultRemoteObjectFactory.java:55)
1000.7.11.10.6.20121015232500
SAP AG
sap.com
MESSAGING
Regards
Omkar -
Two Extractor logics for full and delta?
Hi,
I have a situation where the logic for full uploads and the logic for delta uploads are drastically different!
What is the best way to deal with it?
Should I have two different extractors for full and delta?
Raj
Hi Raj,
I hope you are working with the generic extractors. If the posting volume is too high then you definitely need to restrict the data from the R/3 side in terms of performance, and you need to maintain the delta functionality for the same.
If you cannot maintain the delta from the R/3 side, then at least try to restrict it from the BW side at the InfoPackage level, where you can extract the data based on 'changed on' and 'created on' with two different InfoPackages and update them to the ODS (here make sure all the objects are set to overwrite). Then you can maintain the delta load from the ODS to the cube easily.
If the volume of data is fairly small, then you can use a full load to the ODS and then a delta to the cube.
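The date-restricted pseudo-delta described above can be sketched in Python (a toy model; the field names ERDAT/AEDAT, document numbers, and dates are invented for illustration, not taken from any real system):

```python
from datetime import date

# Hypothetical source records: erdat = created-on date, aedat = changed-on date
source_records = [
    {"doc": "4711", "erdat": date(2024, 1, 10), "aedat": date(2024, 1, 15), "amount": 100},
    {"doc": "4712", "erdat": date(2024, 1, 15), "aedat": date(2024, 1, 15), "amount": 200},
    {"doc": "4700", "erdat": date(2023, 12, 1), "aedat": date(2023, 12, 1), "amount": 50},
]

def pseudo_delta(records, since):
    """Select records created OR changed since the last run
    (the two InfoPackage selections mentioned above)."""
    return [r for r in records if r["erdat"] >= since or r["aedat"] >= since]

# ODS in overwrite mode: keyed on the document number, so repeats are harmless
ods = {}
for r in pseudo_delta(source_records, since=date(2024, 1, 15)):
    ods[r["doc"]] = r

print(sorted(ods))  # ['4711', '4712']
```

Because the ODS overwrites on its key, extracting a slightly too-wide date range only re-sends records that end up identical, which is what makes this pseudo-delta safe before the real delta from ODS to cube.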
Thanks
Hope this helps.. -
There was an ongoing delta load happening between DSO 1 and DSO 2.
I deleted all requests from DSO 2.
And then I ran a first full DTP (DSO 1 to DSO 2) and, immediately after the full DTP, ran a delta DTP (DSO 1 to DSO 2).
In Full DTP : There were 10,000 Records added into DSO 2
In delta DTP : there were 2,000 records added into DSO 2
I was a bit surprised when I saw some data added in the delta, since I had run the full DTP just 5 minutes before, so my understanding was that the delta would not bring anything.
By looking at various threads, I came to know that a full DTP gets data from the active table and a delta DTP gets data from the change log table.
However, what if the same records are present in both the active table and the change log? The same records will be picked up by the full DTP run and again by the delta DTP run. Assuming the target DSO is in additive mode, duplicate records will be added in such a scenario and the final result would be different (i.e. the data gets doubled up).
How do we avoid such a situation?
Any thoughts?
Edited by: Prashant on Aug 12, 2010 2:24 PM
Hi,
Full DTP: if you have 1 request in the PSA today and your DTP extraction mode is full, it will load the data into the target.
The next day you run the InfoPackage and get another request, which means in total you have
2 requests in the PSA. What the full DTP does is try to load both of them, and you will get an error about duplicate records.
Delta DTP: it will load only the fresh requests from the source.
However, we don't do any initialisation in the DTP; full and delta are only the extraction modes for data from the source, and it is the InfoPackage that fetches the deltas once the initialisation is done.
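To illustrate the difference between the two extraction modes (a toy sketch in Python, not actual BW code; the request names and records are invented):

```python
# Requests sitting in the PSA: request id -> records
psa_requests = {
    "REQ1": [("CUST1", 100)],
    "REQ2": [("CUST2", 200)],
}

def full_dtp(requests):
    """Full extraction mode: reads every request, every time it runs."""
    return [rec for recs in requests.values() for rec in recs]

def delta_dtp(requests, transferred):
    """Delta extraction mode: reads only requests this target has not seen yet."""
    new = [req for req in requests if req not in transferred]
    transferred.update(new)
    return [rec for req in new for rec in requests[req]]

transferred = set()
print(len(full_dtp(psa_requests)))                # 2 records, and 2 again on every rerun
print(len(delta_dtp(psa_requests, transferred)))  # 2 on the first delta run
print(len(delta_dtp(psa_requests, transferred)))  # 0 on the second run: nothing new
```

Rerunning the full mode re-reads everything (hence the duplicate-record error against a target with unique keys), while the delta mode only picks up requests added since its last run.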
hope it helps.
Thanks & Regards
krishna.y -
Issue regarding Full and Delta InfoSpoke
Hi Experts,
I have created 2 InfoSpokes, full and delta, to extract data from a DSO to SQL Server. The settings in both InfoSpokes are identical except for the extraction mode. However, the delta InfoSpoke is pulling the correct number of records from the DSO (360 records), but the full InfoSpoke is extracting only 196 records. I also tried loading the DSO using both full/delta InfoPackages and full/delta DTPs, but the results were still the same. Can anyone tell me how this issue can be resolved?
Thanks,
Rajan
Dear Rajan,
I'd like to try to help you with your question.
Your case is curious. Did you check the filter in both InfoPackages? Try deleting them and creating them again (it is very important to check the whole filter). Otherwise, I suggest using an Open Hub Destination, which in BI 2004s has the same functionality (exporting data).
I hope this suggestion can help you,
Luis -
Hi all,
I'm studying the new "System Planning, Deployment and BP Guide" for the 10.3 beta.
I see that in a "Daily use test" they schedule "One full and three delta inventory scans per device".
I did not know that in ZCM10 there were two types of inventory. Is it a new functionality? Can anybody give me some more info?
Thank you
mauro,
It appears that in the past few days you have not received a response to your
posting. That concerns us, and has triggered this automated reply.
Has your problem been resolved? If not, you might try one of the following options:
- Visit http://support.novell.com and search the knowledgebase and/or check all
the other self support options and support programs available.
- You could also try posting your message again. Make sure it is posted in the
correct newsgroup. (http://forums.novell.com)
Be sure to read the forum FAQ about what to expect in the way of responses:
http://forums.novell.com/faq.php
If this is a reply to a duplicate posting, please ignore and accept our apologies
and rest assured we will issue a stern reprimand to our posting bot.
Good luck!
Your Novell Product Support Forums Team
http://support.novell.com/forums/ -
Filling of set up tables for Full and Delta updates
Dear Gurus,
How do I go about filling the setup tables if I want to load historical data to BW using the LO Cockpit while the setup table's DataSource is already running on delta?
I want to know how many setup tables there will be for each DataSource, because if the DataSource is running on delta and I want to load historical data using the same table, will I have to delete and refill the setup tables, or will there be a separate setup table for full upload and delta?
If we have multiple setup tables for one DataSource, then in the transaction OLI*BW (where * is the application's logistics number, e.g. OLI6BW), which setup table will it refer to?
Clarify my doubt and get rewarded with plenty of points.
Regards
Mohan
Hi,
Filling up the setup tables depends on the DataSource;
there are different T-codes for the respective extract structures:
OLIIBW transaction PM data
OLI1BW INVCO Stat. Setup: Material Movemts
OLI2BW INVCO Stat. Setup: Stor. Loc. Stocks
OLI3BW Reorg.PURCHIS BW Extract Structures
OLI4BW Reorg. PPIS Extract Structures
OLI7BW Reorg. of VIS Extr. Struct.: Order
OLI8BW Reorg. VIS Extr. Str.: Delivery
OLI9BW Reorg. VIS Extr. Str.: Invoices
OLIABW Setup: BW agency business
OLIFBW Reorg. Rep. Manuf. Extr. Structs
OLIIBW Reorg. of PM Info System for BW
OLIQBW QM Infosystem Reorganization for BW
OLISBW Reorg. of CS Info System for BW
OLIZBW INVCO Setup: Invoice Verification
OLI7BW is the tcode for Sales Order.
Hope this solves your issue.
Assign points if helpful.
Rgs,
I.R.K -
Hi,
Is there any procedure (program or table) to find out the list of full loads and delta loads in a BW system?
Regards
Jaya
Edited by: Jayalakshmi Busetty on Feb 19, 2010 6:04 AM
Hi Jayalakshmi,
You can find this in the data targets, via the Manage button.
Thanks & Regards,
Venkat. -
Mixed Full and Delta from ODS to Cube
Hi,
we're facing the following problem:
We're loading data from an ODS to a cube via delta, every night. Due to reporting needs it's
necessary for us to write some InfoObject attributes to the cube.
These attributes will change sometimes. To display the current truth, we have to delete the old
assignments from the cube and update them with the new attribute values.
My idea was the following:
1. Identify the lines in the cube with the old attributes
2. Delete them via "selective deleting"
3. Load the lines with the new attributes from the ODS to the Cube
4. Proceed with delta upload from the ODS.
Is this a possible way, or are there any better ideas?
Kind regards
Lars
Hi,
thanks for your reply. Of course we update the attributes.
For example:
Today InfoObject IO (value IO-Value) has attribute A with the value B.
Row in cube: IO-Value, B, key figure.
Tomorrow: InfoObject IO (value IO-Value) has attribute A with the value C.
So I have to correct the value of the attribute in the cube.
Navigation attributes can't be used.
Lars -
Extraction - full or delta or neither?
Hi all,
I have encountered a strange behaviour.
I have a DataSource that extracts data from R/3 to BW. In the "Test DataSource" screen, I put a range of 2007.10 on the posting period, and it extracts all the records for 2007.10 for me.
But when I execute the InfoPackage in BW, and subsequently the DTP and activation of the DSO, I get both 2007.10 and 2007.11 data in my DSO (which means all the data ran into my DSO).
I suspect there is something wrong with my DataSource or DTP, because in my Data Transfer Process Monitor, under the "Details" tab under "Process Request", there are multiple data packages, and it seems like this includes all the previous requests and loads that I have performed onto this DSO.
In my DTP extraction mode, I have tried both "Full" and "Delta" and in both cases, I got same problem.
Does anybody have any clue on how to resolve this? Is there a way to reset it, or have I missed something? Any help appreciated. Thanks a lot.
- JK
Hi Alec,
If I delete the InfoPackage, would all the requests be deleted as well?
What I have tried is to delete the InfoPackage, delete data in DSO, and create Infopackage again, then execute DTP again.
As said, the PSA got only the data I need, but when I loaded via the DTP and activated the ODS, I got everything...
Still don't know why it behaves like that...
- JK -
Full upload after INIT and Delta
Hi Experts,
a quick question: if there are init and delta InfoPackages running and I run another InfoPackage with the full update option, will there be any issue with the delta?
When do I need to run a repair? Only when there is a possibility of duplicate loading?
Thanks - SM
Thanks Srini/Vikram for your response.
When deltas are already running for the DataSource, you can't request a full load, but you can request a 'repair full' load, which acts as and extracts data like a full load, and it won't disturb the present deltas.
>> Thanks Srini, that's the bottom line: after delta you need to run a repair full, not a full, and the repair full will also pull data from the setup table (history). I was thinking repair full was for taking care of duplicates, but dups we need to remove manually.
Where is the repair full option in the InfoPackage?
Where can I see all the init conditions for an InfoPackage? For example, if init was done 3-4 times (year 2001-2002, 2002-2003, etc.), all these init criteria should be stored somewhere?
Edited by: toti shu on Aug 31, 2010 5:08 PM
Edited by: toti shu on Aug 31, 2010 5:11 PM -
Where to even begin with the problems I'm having!? I have a MacBook Pro (OS X 10.5.8), an iMac (OS X 10.5.8), a 500GB Time Capsule, and an iPhone 4S. I know just enough about all of them to basically get myself into trouble... and it seems I have done a really good job of messing everything up!
I bought the iMac and Time Capsule together in 2008. The Time Capsule didn't work properly for backups from the start (although it functions great as a router for my home network), and I spent several hours on the phone with AppleCare, took it to the Genius Bar, etc., and it still has not ever worked as a continuous background backup. I did somehow manage to acquire one backup of the iMac with the TC before I did a reboot of the iMac a couple of years ago. I can access the backup, but I'm not sure if the backup is actually on the Time Capsule or if it is saved on the HD. Furthermore, I have another backup (pretty sure it's a partial backup) from the MacBook Pro, and I don't know where that one is actually stored either. The MacBook Pro is so full that it is not functioning properly and I'm afraid it will die. I have subscribed to both Backblaze and Crashplan... The initial Backblaze backup completed on the MacBook, but I don't even have enough disk space to download Crashplan. I need to get both Macs backed up to an external hard drive and I would like to save the backups I already have. I am also trying to start my own photography business, so I would like everything set up so that I can manage photos from both Macs in a way that is conducive to storing and editing loads of photos. So here are my questions...
1. Can I (and should I) back up both Macs to one external hard drive?
2. Can I use the Time Capsule to route wireless continuous backups from the MacBook Pro to the external hard drive or would I need to have it wired every time?
3. Which external hard drive would be best for my situation? Would a RAID be best or is that overkill?
I am considering this one in either a 2TB or 3TB:
http://eshop.macsales.com/shop/firewire/1394/USB/EliteAL/eSATA_FW800_FW400_USB
4. Once everything is safely backed up, what is the best way to get my HD space freed up?
5. How do I set up the drives so that I'm not using all of my internal HD space for photos and music?
6. When I have enough space to upgrade to OS X Lion, will the cloud make a difference with how my photos are managed?
I know this has been extensive, and probably frustrating to those of you who know what you're doing... Thanks so much for your time and patience!!
1sweetORnurse wrote:
1. Can I (and should I) back up both Macs to one external hard drive?
You can (if it's large enough), but see below. You should probably create a second partition on the drive for the second Mac. You can also back them both up to the TC.
2. Can I use the Time Capsule to route wireless continuous backups from the MacBook Pro to the external hard drive or would I need to have it wired every time?
There are several options:
Since you have an iMac, by far the fastest, easiest, and most reliable way to back it up is to connect an external HD directly to it.
It's slower, but you can also back it up to the TC.
The laptop can be backed-up to the same drive as the one connected to the iMac, over your network.
Finally, although I wouldn't recommend it in this case, you could back up either or both to an external HD connected to the TC.
3. Which external hard drive would be best for my situation?
Most likely, a good quality external HD. If you're going to use Time Machine, see Time Machine - Frequently Asked Question #1 to be sure you get one large enough.
Would a RAID be best or is that overkill?
In most cases, overkill. For redundancy, duplicate backups made on a separate HD, with a different app, is a bit safer than a RAID. See FAQ #27 for some suggestions.
I am considering this one in either a 2TB or 3TB:
http://eshop.macsales.com/shop/firewire/1394/USB/EliteAL/eSATA_FW800_FW400_USB
That link doesn't work for me -- it redirects to the main page. I have one of their Mercury Elite Pro multi-interface drives, and it works fine. Most anything they sell will be fine, as long as it's large enough.
4. Once everything is safely backed up, what is the best way to get my HD space freed up?
Even before you do that, check for hidden large stuff on your Mac that you don't need. See Where did my Disk Space go? There may be some things there that will gain you a few GBs, at least.
Also, obviously, delete anything you don't need, especially video, photos, music.
Do you have duplicates of the same files on both Macs? If so, deleting one might save a lot of space.
Then, you need to consider where you're going in the future. Backups alone won't gain you any space -- the essential part of a good backup strategy is to have (at least) two copies of everything important in (at least) two different places.
So most likely, you need to move some big stuff, such as your iPhoto and/or iTunes libraries, or a large video collection, onto a different external HD, and let Time Machine (and/or other backup app) back it up along with your Macs.
5. How do I set up the drives so that I'm not using all of my internal HD space for photos and music?
Be sure to format them for a Mac (most come pre-formatted for Windoze). See #1 in Using Disk Utility.
Depending on what kind of data you want to move, you can just drag and drop, then delete.
And/or see:
Moving your iTunes Music folder
Moving your iPhoto '11 Library
Moving iMovie '11 footage to another drive
6. When I have enough space to upgrade to OS X Lion, will the cloud make a difference with how my photos are managed?
I don't think so.
What's on your TC now? Anything? If not, you might want to erase it and start backing up immediately, so you have at least some protection until you get the external HD(s). See To erase everything on the TC's internal disk in #Q5 of Using Time Machine with a Time Capsule.
Then see #Q1 there for setup instructions. If there's a problem, post back with details, or check Time Machine - Troubleshooting.
Also don't hesitate to post back if this isn't clear, or you still have questions. -
Hi,
I am re-posting a question from the following page here as I have the same problem now and I can't seem to find the answer even though the question has been marked as solved: http://social.technet.microsoft.com/Forums/en/windowsbackup/thread/daff108b-effa-4dad-807a-d604345975dd
Below is a copy of the question:
I have 2 backup drives, (Backup Drive 1 and Backup Drive 2)
Let's assume I have never performed a backup, so Day 1 is the first backup of the server.
Backup Drive 1 connected on Day 1 it performs a FULL Backup - this is expected.
Backup Drive 1 connected on Day 2 it performs an INCREMENTAL Backup - this is expected.
Backup Drive 2 connected on Day 3 it performs a FULL Backup - this is expected.
Backup Drive 1 connected on Day 4 it performs a FULL Backup - WHY? My understanding is that it should perform an Incremental Backup from Day 2.
Backup Drive 2 connected on Day 5 it performs a FULL Backup - again, why is it not performing an Incremental Backup from Day 3?
This means that in normal operation (Backup Drives alternating every day) we are performing FULL Backups every day. In a few months this won't be an acceptable backup strategy; it will take too long.
I've used 'Configure Performance Settings' in the Windows Server Backup mmc to specify 'Always perform incremental backup' - it makes no difference.
If I look at the Backup Drive Disk Usage details it confuses me even more. It may be performing Full Backups everyday but it's definitely not storing Full Backup copies on the Backup Drives.
It seems to be that even when a Full Backup is performed, only the deltas are written to the Backup Drive so even though it takes longer it has the same effect as an incremental (so why doesn't it just perform an incremental?)
I don't understand Microsoft's use of Full and Incremental, it seems obtuse to provide a choice that appears to have no effect on the actual data written to the Backup Drive.
My real-world experience is at odds with that statements made in
The Official SBS Blog it states "every backup is incremental from a storage point of view" as well as "Because the wizard schedules differential-based backups' (nowhere in the Backup mmc have I seen any reference or options for differential),
"Backup runs quickly" and "...works the same for multiple disk rotation." (This is simply not the case with a 2-disk rotation. It runs a lengthy FULL Backup every time.)
The backup has been configured using SBS Console, runs once a day at 16:00. 2 Backup Drives, alternating daily.
Can anyone clarify Windows Backup operation for me?
I'd appreciate any feedback at all, thanks.
Optimizing Backup and Server Performance
(Windows Server 2008 R2)
http://technet.microsoft.com/en-us/library/dd759145.aspx
Even if you choose the option to create incremental backups, Windows Server Backup will create a full backup once every 14 days, or after 14 incremental backup operations, to reduce the risk from disk corruptions.
Of course, this is for R2.
Merv Porter [SBS-MVP]
============================
"Merv Porter [SBS-MVP]" wrote in message
news:a1ca618e-ad66-4770-8c39-21285a08f671...
Interesting post...
WBAdmin to remote backup server
http://social.technet.microsoft.com/Forums/en-US/windowsbackup/thread/764fe9a4-960e-4e90-b8fb-8e7581752a9d#520f38fe-149c-4424-9c0b-54695297e919
In Windows Server 2008, there are several reasons which may cause the backup to be full instead of incremental
1. Backup on the target is deleted/not present.
2. Source volume snapshot is deleted, from which the last backup was taken.
3. 7 days have passed or 6 incremental backups have happened since the last full backup.
4. Churn on the backup source is high (more than 50%)
Abhinav Mathur[MSFT] Microsoft
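The four conditions quoted above can be sketched as a small decision function (a toy model of the rules as listed, not the actual Windows Server Backup implementation):

```python
def next_backup_mode(target_has_backup, source_snapshot_present,
                     days_since_full, incrementals_since_full, churn_ratio):
    """Return 'FULL' or 'INCREMENTAL' per the four listed conditions."""
    if not target_has_backup:          # 1. backup on the target deleted/not present
        return "FULL"
    if not source_snapshot_present:    # 2. source snapshot of the last backup is gone
        return "FULL"
    if days_since_full >= 7 or incrementals_since_full >= 6:  # 3. age limits
        return "FULL"
    if churn_ratio > 0.5:              # 4. more than 50% of the source changed
        return "FULL"
    return "INCREMENTAL"

# With a two-disk rotation, the source snapshot chain for the absent drive is
# broken while the other drive is in use, so condition 2 keeps forcing a full:
print(next_backup_mode(True, False, 1, 1, 0.1))  # FULL
print(next_backup_mode(True, True, 1, 1, 0.1))   # INCREMENTAL
```

This matches the behaviour reported in the thread: each drive sees a full backup every time it is reconnected, which is one plausible reading of why the daily rotation always runs full.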
Merv Porter [SBS-MVP]
============================
"Les Connor [SBS-MVP]" wrote in message
news:0053cd83-75d1-4dbc-8182-ae67cadf4780...
I believe it's how backup is designed, as I see the same thing. Why it works
the way it does is another story though, I don't know the answer to that.
Les Connor [SBS MVP]
"Kurni" wrote in message
news:[email protected]...
> Hi Les,
>
> Thank you for your reply.
>
> Are you saying that that's how the backup is designed? What I (and the
> original poster of the question) have in mind is actually different (each
> disk should have their own base backup).
>
> Quoting from the original question:
>
> Backup Drive 1 connected on Day 1 it performs a FULL Backup - this is
> expected.
> Backup Drive 1 connected on Day 2 it performs an INCREMENTAL Backup - this
> is expected.
> Backup Drive 2 connected on Day 3 it performs a FULL Backup - this is
> expected.
> Backup Drive 1 connected on Day 4 it performs a FULL Backup - WHY? my
> understanding is that is should perform an Incremental Backup from Day 2
> Backup Drive 2 connected on Day 5 it performs a FULL Backup - again why is
> it not performing an Incremental Backup from Day 3 ?
> Please let me know if I actually misunderstand how windows server backup
> work.
>
MVP - SBS -
Setup table and delta mechanisim
Hi,
Can anyone explain in detail setup table concept in LO extraction and
Delta mechanisim?
Sridhar
Hi
LO Cockpit Step By Step
Here is LO Cockpit Step By Step
LO EXTRACTION
- Go to Transaction LBWE (LO Customizing Cockpit)
1). Select Logistics Application
SD Sales BW
Extract Structures
2). Select the desired Extract Structure and deactivate it first.
3). Give the Transport Request number and continue
4). Click on 'Maintenance' to maintain the extract structure
Select the fields of your choice and continue
Maintain DataSource if needed
5). Activate the extract structure
6). Give the Transport Request number and continue
- Next step is to Delete the setup tables
7). Go to T-Code SBIW
8). Select Business Information Warehouse
i. Setting for Application-Specific Datasources
ii. Logistics
iii. Managing Extract Structures
iv. Initialization
v. Delete the content of Setup tables (T-Code LBWG)
vi. Select the application (01 - Sales & Distribution) and Execute
- Now, Fill the Setup tables
9). Select Business Information Warehouse
i. Setting for Application-Specific Datasources
ii. Logistics
iii. Managing Extract Structures
iv. Initialization
v. Filling the Setup tables
vi. Application-Specific Setup of statistical data
vii. SD Sales Orders - Perform Setup (T-Code OLI7BW)
Specify a run name, time, and date (use a future date)
Execute
- Check the data in Setup tables at RSA3
- Replicate the DataSource
Use of setup tables:
You should fill the setup tables in the R/3 system and extract the data to BW (the setup table filling is done in SBIW); after that you can do delta extractions by initializing the extractor.
Full loads are always taken from the setup tables.
1. Data Flow
Relevant T-Codes
a) Initialization
LBWE: Maintain Extract Structure, DataSource, Active Status, Job Control, Update Mode
SBIW, then follow tree menu path: Settings for App-Specific DS(PI) -> Logistics -> Managing Extract Structures -> Initialization -> App-Specific Setup of Statistical Data -> Choose XX Application -> Delete/Fill Setup(reconstructing) Table content
RSA3: Check content of setup table
b) Delta upload
SM13: Display update table/Extraction Queue content
RSA7: Display Delta Queue content
Edited by: Tapashi Saha on Jul 24, 2008 7:33 AM