Clarifications required on Data load update types...

Hi Friends,
I have seen many previous threads in the forum regarding Init, Delta, Full update, and Full update with Repair.
Can someone explain the following to me clearly?
1: What is a Repair Full update? When do we use it? Please give an example.
2: If a delta load to an ODS fails, what should I do?
3: What if a delta from the ODS to the cube fails?
4: When do we use Full update? Does a Full update duplicate the records each time?
Thanks in advance for your answers.
Sudha

What is a Repair Full update? When do we use it? Please give an example.
You can use a repair full request when you have missed delta loads or there are data-corruption issues. By doing a full repair load, you can ensure that your data is correct and has good integrity.
With this option you can continue to use your existing delta and not worry about resetting the delta initialization.
If a delta load to an ODS fails, what should I do?
Check whether the data has reached the PSA for all data packages; if so, you can update the data from the PSA. If the extraction part did not finish, you need to find out what the error is: force the request to red in the ODS, delete it from the target, fix the error, and reload.
What if a delta from the ODS to the cube fails?
You need to force the request to red in the cube and reset the data mart status (the tick in the ODS), then identify the error, fix it, and reload. (In this case too, if the data is in the PSA, there is no need to reset the data mart status; you can pull it from the intermediate PSA itself.)
When do we use Full update? Does a Full update duplicate the records each time?
A full load pulls the records from the setup tables (from the setup run dates). Whether records get duplicated depends on your data target settings: in a cube, duplication will occur; an ODS that overwrites by key will not duplicate. In many cases the previous full request is deleted and a new full request is loaded.
Regards,
B

Similar Messages

  • Automate dimension/data load update in Planning?

    Hi all,
Without App Link and Translator (the client did not buy them), how can we automate dimension/data loads in Planning? Any sample?
    Regards,
    Kenneth

    Hi John,
    I did the tests in Windows 2003 Server with Planning 9.3.1.
    1. 'bug 6829439: Task Flow is always set to active although the job has completed successfully'.
I had a flow with 2 tasks. In this case, when I launch the job it remains in a '4% complete' state, because once the first task is completed the system doesn't put the task in the 'completed' state and thus doesn't start the second one.
    2. 'bug 6785224: Manage TaskFlow does not show with interface tables'.
    When working with interface tables one cannot schedule taskflows.
A colleague of mine working for another client has already opened calls with Hyperion regarding these issues. Hyperion provided the bug numbers and informed us they will be fixed in version 9.5.
3. Now I am testing the ODI connectors on a Planning 9.2 system and have some small problems (I've just opened a thread on this forum).
    Daniela S.

  • Working with Data Loading page type

    All,
I am creating a page of type="Data Loading" so I can load various data. However, I want to bypass the step where we map the fields/columns (i.e., Data/Table Mapping), because all the data I load is stored in only one column, so this mapping step is unnecessary. My requirement is to load the data, and then when you click the Next button it should take you to the "Data Validation" step and finally the "Data Load Results" step.
    How do i go about this?
    apex 4.1.1

You might try to mimic a press of the [Next] button. A simple page submit after rendering will probably do, so create a Dynamic Action that submits the page immediately after load.
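A minimal sketch of that Dynamic Action (event: Page Load, action: Execute JavaScript Code), assuming the [Next] button submits the page with request NEXT; check the button definition for the actual request value:

apex.submit('NEXT');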

  • Coding Required for Data Load...

    Hi All,
I am stuck on an issue where I know the logic but am unable to write the code. The requirement is as follows:
    I have the records in Excel as given below:-
    Fiscal Year          2004
    Key                     004
    Capitalization       A
    Norm Debts         B=0.7*A
    Rate of Interest    C
    Repayment          D=B/10
    Op. Balance         D=B/10
    Cl. Balance          E = B-D
    Total Interest        G=E+F/2*C
The above record is the data I am loading the first time, and it is only one record. The cube should be populated with data until the closing balance becomes zero.
    So my logic will be
    First record by excel and second onwards is :-
Fiscal Year         2004 + 1 (it should add 1 to every record till the closing balance becomes zero)
    Key                    004 (Constant till Closing balance becomes Zero)
    Capitalization      A (Constant till Closing balance becomes Zero)
    Norm Debts        B (Constant till Closing balance becomes Zero)
    Rate of Interest    C (Constant till Closing balance becomes Zero)
    Repayment         D (Constant till Closing balance becomes Zero)
    Opening Balance   (Closing Balance will become Opening Balance here) = B-D
    Closing Balance     (Closing balance will now again will be debited by repayment that is B-2D)
    Total interest        = (Opening Balance + Closing Balance)/ 2 * Rate of Interest
So the next record will have opening balance = B-2D and closing balance = B-3D, and the records should be fetched automatically until the closing balance becomes zero.
    Any help will be assigned with points.
    Thanks and Regards,
    Sangini Mathur.

    Hi Sangini,
I think in your case you have to loop over SOURCE_PACKAGE
in the start routine; only then can you meet your requirement.
DATA: w_cl_balance TYPE p DECIMALS 2.
WHILE w_cl_balance GT 0.
*   put all your calculations here
ENDWHILE.
    Hope it helps
    Bhaskar
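A fuller (hypothetical) sketch of that start routine, assuming a 7.x transformation; the structure and field names (cl_balance, repayment, etc.) are placeholders to be adapted to the actual SOURCE_PACKAGE structure:

DATA: ls_rec TYPE _ty_s_sc_1,
      lt_new TYPE _ty_t_sc_1.
LOOP AT SOURCE_PACKAGE INTO ls_rec.
* Generate one follow-on record per year until the closing balance is cleared
  WHILE ls_rec-cl_balance GT 0.
    ls_rec-fiscyear   = ls_rec-fiscyear + 1.          "next fiscal year
    ls_rec-op_balance = ls_rec-cl_balance.            "old closing = new opening
    ls_rec-cl_balance = ls_rec-cl_balance - ls_rec-repayment.
    ls_rec-tot_int    = ( ls_rec-op_balance + ls_rec-cl_balance ) / 2 * ls_rec-rate.
    APPEND ls_rec TO lt_new.
  ENDWHILE.
ENDLOOP.
APPEND LINES OF lt_new TO SOURCE_PACKAGE.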

  • Help required in data loads

    Hi all,
We have installed a patch in the R/3 system, so now we are testing whether the full and delta loads still work in the BW 3.5 system.
Full loads have no problems, but a few delta loads do. What we did is:
1) Re-initialize the delta loads.
2) Create/change some records in the R/3 system.
3) Schedule the delta loads in BW.
But 0 records arrive in BW even though the records are present in R/3.
    Kindly help.

    Hi
You mentioned that no records are moved to BW.
1) Did you check on the R/3 side whether the data is in RSA7?
2) If not, check SM13 (V3 update) and LBWQ (queued delta). Activate the DataSource.
3) Replicate the DataSource in BW.
4) Run the program RS_TRANSTRU_ACTIVATE_ALL in SE38.
5) Perform the data load.
You should get the data this way.
    I think this will help you.
    Regards,
    Siva

Urgent help required - ASO data loading

I am working on 9.3.0 and want to do an incremental data load so that it won't affect my past data. I am still not sure whether this is doable. Now the question is:
do I need to design a new aggregation after loading the data?
    Thanks in advance

    Hi,
The ASO cube will clear off all its data if you make structural changes to the cube, i.e., if you change your outline (you can find out exactly what clears the data: for example, adding a new member clears it, while just adding a simple formula to an existing member might not).
If you don't want to affect the past data and yet want to load incrementally, ensure that all the members are already in the outline (take care of the time dimension: have all the time members in place), and then load using the 'add to existing values' option.
But remember, you can only do this without structural changes; otherwise, you need to load everything again.
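A minimal MaxL sketch of such an incremental load, mirroring the script style earlier in this thread; the application, database, and file names are placeholders, and the 'add to existing values' behavior itself is configured inside the rules file:

login admin identified by password on localhost;
import database Sample.Basic data
    from data_file 'incr.txt'
    using server rules_file 'ldadd'
    on error write to 'incr.err';
exit;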
It is good to design an aggregation, as it helps retrieval performance, but it is not mandatory.
    Sandeep Reddy Enti
    HCC

  • R3 Table Required for data load status

    Hi all,
I am on a 3.x version, so RSSTATMANPART (the fast table, only available in BI 7) won't work.
I want the number of records added and transferred for a specific cube on a specific date.
    Thanks in advance.

    Check Tables RSMONICTAB, RSMONFACT, RSMONICDP
    Hope this helps..
    /pradeep

Trace data load errors in more detail

    Hi all.
During a data load (update rules) I get this error message in the Load Monitor: "Arithmetical errors or conversion errors found in routine ROUTINE_0001 record 400".
First of all, there are no routines in the update rules, only one formula. So why does the monitor point to some ROUTINE_0001?
Second, is there any other way in BI to trace data load/update errors in more detail?

    Hi,
This error comes from the internal conversion routines that run at load time; a formula is also compiled into a generated routine internally, which is why the monitor reports ROUTINE_0001. While loading the data there is a chance that some invalid (non-numeric) characters are coming into fields with data type NUMC/AMOUNT/QUANTITY.
Arithmetic operations cannot be performed on such data, so check whether this is the reason you are getting the above error.
    Regards,
    Yogesh.

  • Error Message after data load (error is "1130610")

We load data using a data load rule. The data seems to load fine, except we get the error message "ERROR - 1241101 - Unexpected Essbase error 1130610" on the last line below. Any thoughts? Is there a guide somewhere that explains in detail all the various error codes and what they refer to? We're using Essbase 6.5.1. Thanks!
=======================================
OK/INFO - 1003037 - Data Load Updated [364875] cells.
OK/INFO - 1003024 - Data Load Elapsed Time : [428.323] seconds.
ERROR - 1241101 - Unexpected Essbase error 1130610.
=======================================

The explanation of error 1130610 is the following:
Possible Problems - Essbase cannot open a file.
Possible Solutions:
- If you are using an error file, make sure that the error file is being created in a directory that already exists.
- Make sure you are using the ESSCMD IMPORT command correctly.
- Put all files the ESSCMD script needs in the $ARBORPATH\APP\applicationName\databaseName directory, and run the ESSCMD script from that directory.
- Check the ESSCMD script for invalid paths, and make sure every folder that the script points to exists.

  • FDM event scripts firing twice during data loads

    Here's an interesting one. I have added the following to three different event scripts (one at a time, ensuring only one of these exists at any one time), to clear data before loading to Essbase:
    Event Script content:
    ' Declare local variables
    Dim objShell
    Dim strCMD
    ' Call MaxL script to run data clear calculation.
    Set objShell = CreateObject("WScript.Shell")
    strCMD = "D:\Oracle\Middleware\EPMSystem11R1\products\Essbase\EssbaseClient\bin\startMAXL.cmd D:\Test.mxl"
    API.DataWindow.Utilities.mShellAndWait strCMD, 0
    MaxL Script:
    login ******* identified by ******* on *******;
    execute calculation 'FIX("Member1","Member2") CLEARDATA "Member3"; ENDFIX' on *******.*******;
    exit;
    However, it appears that the clear is carried out twice, both before and after the data has been loaded to Essbase. This has been verified at each step by checking the Essbase application log:
    No event script:
    - No Essbase data clear in application log
    Adding above to "BefExportToDat" event script:
    - Script is executed once after clicking on Export in FDM Web Client (before the "Target System Load" modal popup is displayed). Entries are visible in Essbase Application log.
    - Script is then executed a second time when clicking on the OK button in the "Target System Load" modal popup. Entries are visible in Essbase Application log.
    Adding above to "AftExportToDat" event script:
    - Script is executed once after clicking on Export in FDM Web Client (before the "Target System Load" modal popup is displayed). Entries are visible in Essbase Application log.
    - Script is then executed a second time when clicking on the OK button in the "Target System Load" modal popup. Entries are visible in Essbase Application log.
    Adding above to "BefLoad" event script:
    - Script is NOT executed after clicking on Export in FDM Web Client (before the "Target System Load" modal popup is displayed).
    - Script is executed AFTER the data load to Essbase when clicking on the OK button in the "Target System Load" modal popup. Entries are visible in Essbase Application log.
    Some notes on the above:
    1. "BefExportToDat" and "AftExportToDat" are both executed twice, before and after the "Target System Load" modal popup. :-(
    2. "BefLoad" is executed AFTER the data is loaded to Essbase! :-( :-(
Does anyone have any idea how we might execute an Essbase database clear before loading data, and not after we have loaded the fresh data? And perhaps why the above event scripts appear to be firing twice? There does not appear to be any logic to this!
    BefExportToDat - Essbase Application Log entries:
[Wed May 16 16:19:51 2012]Local/Monthly/Monthly/admin@Native Directory/140095859451648/Info(1013091)
Received Command [Calculate] from user [admin@Native Directory]
[Wed May 16 16:19:51 2012]Local/Monthly/Monthly/admin@Native Directory/140095859451648/Info(1013162)
Received Command [Calculate] from user [admin@Native Directory]
[Wed May 16 16:19:51 2012]Local/Monthly/Monthly/admin@Native Directory/140095859451648/Info(1012555)
Clearing data from [Member3] partition with fixed members [Period(Member1); Scenario(Member2)]
...
[Wed May 16 16:20:12 2012]Local/Monthly/Monthly/admin@Native Directory/140095932016384/Info(1003037)
Data Load Updated [98] cells
[Wed May 16 16:20:12 2012]Local/Monthly/Monthly/admin@Native Directory/140095932016384/Info(1003024)
Data Load Elapsed Time : [0.52] seconds
...
[Wed May 16 16:20:12 2012]Local/Monthly/Monthly/admin@Native Directory/140095930963712/Info(1013091)
Received Command [Calculate] from user [admin@Native Directory]
[Wed May 16 16:20:12 2012]Local/Monthly/Monthly/admin@Native Directory/140095930963712/Info(1013162)
Received Command [Calculate] from user [admin@Native Directory]
[Wed May 16 16:20:12 2012]Local/Monthly/Monthly/admin@Native Directory/140095930963712/Info(1012555)
Clearing data from [Member3] partition with fixed members [Period(Member1); Scenario(Member2)]
AftExportToDat - Essbase Application Log entries:
[Wed May 16 16:21:32 2012]Local/Monthly/Monthly/admin@Native Directory/140095933069056/Info(1013091)
Received Command [Calculate] from user [admin@Native Directory]
[Wed May 16 16:21:32 2012]Local/Monthly/Monthly/admin@Native Directory/140095933069056/Info(1013162)
Received Command [Calculate] from user [admin@Native Directory]
[Wed May 16 16:21:32 2012]Local/Monthly/Monthly/admin@Native Directory/140095933069056/Info(1012555)
Clearing data from [Member3] partition with fixed members [Period(Member1); Scenario(Member2)]
...
[Wed May 16 16:21:47 2012]Local/Monthly/Monthly/admin@Native Directory/140095930963712/Info(1003037)
Data Load Updated [98] cells
[Wed May 16 16:21:47 2012]Local/Monthly/Monthly/admin@Native Directory/140095930963712/Info(1003024)
Data Load Elapsed Time : [0.52] seconds
...
[Wed May 16 16:21:47 2012]Local/Monthly/Monthly/admin@Native Directory/140095928858368/Info(1013091)
Received Command [Calculate] from user [admin@Native Directory]
[Wed May 16 16:21:47 2012]Local/Monthly/Monthly/admin@Native Directory/140095928858368/Info(1013162)
Received Command [Calculate] from user [admin@Native Directory]
[Wed May 16 16:21:47 2012]Local/Monthly/Monthly/admin@Native Directory/140095928858368/Info(1012555)
Clearing data from [Member3] partition with fixed members [Period(Member1); Scenario(Member2)]
BefLoad - Essbase Application Log entries:
[Wed May 16 16:23:43 2012]Local/Monthly/Monthly/admin@Native Directory/140095932016384/Info(1013091)
Received Command [Calculate] from user [admin@Native Directory]
[Wed May 16 16:23:43 2012]Local/Monthly/Monthly/admin@Native Directory/140095932016384/Info(1013162)
Received Command [Calculate] from user [admin@Native Directory]
[Wed May 16 16:23:43 2012]Local/Monthly/Monthly/admin@Native Directory/140095932016384/Info(1012555)
Clearing data from [Member3] partition with fixed members [Period(Member1); Scenario(Member2)]
...
[Wed May 16 16:23:44 2012]Local/Monthly/Monthly/admin@Native Directory/140095929911040/Info(1003037)
Data Load Updated [98] cells
[Wed May 16 16:23:44 2012]Local/Monthly/Monthly/admin@Native Directory/140095929911040/Info(1003024)
Data Load Elapsed Time : [0.52] seconds
...
[Wed May 16 16:23:45 2012]Local/Monthly/Monthly/admin@Native Directory/140095860504320/Info(1013091)
Received Command [Calculate] from user [admin@Native Directory]
[Wed May 16 16:23:45 2012]Local/Monthly/Monthly/admin@Native Directory/140095860504320/Info(1013162)
Received Command [Calculate] from user [admin@Native Directory]
[Wed May 16 16:23:45 2012]Local/Monthly/Monthly/admin@Native Directory/140095860504320/Info(1012555)
Clearing data from [Member3] partition with fixed members [Period(Member1); Scenario(Member2)]

    Hi Larry,
    As mentioned, our exports do not appear to be generating the "-B.Dat" and "-C.Dat" files at present. However, you are correct with the Export and Load event scripts firing twice (once for the main TB file and again for the journal file). Does this also mean it could continue to fire an additional two times for the "-B.Dat" and "-C.Dat" files?
    On the last run, the output was as follows with the modified scripts:
    After clicking on Export in Workflow, the Target System Load modal popup is displayed, and the first two files have been generated:
    14.24.15.0527_BefExportToDat.txt
    14.24.17.0617_AftExportToDat.txt
    After clicking on OK in the Target System Load modal popup, the actual load to Essbase takes place. A further six files are generated:
    14.24.21.0289_BefLoad.txt
    14.24.22.0117_AftLoad.txt
14.24.22.0152_BefExportToDat-A.txt
14.24.22.0414_AftExportToDat-A.txt
14.24.22.0433_BefLoad-A.txt
14.24.22.0652_AftLoad-A.txt
This makes a lot more sense, since one can see that the event scripts are being run a second time against the journal files during the data load. Many thanks, this solves my problem, as I can now place my script where I want in the process chain. It's just a shame that there are no separate event scripts to distinguish between the various .Dat exports/loads, which are clearly occurring at separate times in the process chain.
    Many thanks! :-)
    P.S. Updated script below if anyone wishes to use it:
Sub BefExportToDat(strLoc, strCat, strPer, strTCat, strTPer, strFile)
    ' Build a timestamp (with milliseconds) so each firing of the event
    ' produces its own log file
    Dim strF, fso, tf, t, temp, m, miliseconds, strSuf
    t = Timer
    temp = Int(t)
    m = Int((t - temp) * 1000)
    miliseconds = String(4 - Len(m), "0") & m
    strF = "D:\TEST\" & Replace(Time, ":", ".") & "." & miliseconds & "_BefExportToDat"
    ' Tag the log file name with the -A/-B/-C suffix of the .Dat file being processed
    strSuf = UCase(Left(Right(strFile, 6), 2))
    If strSuf = "-A" Or strSuf = "-B" Or strSuf = "-C" Then
        strF = strF & strSuf & ".txt"
    Else
        strF = strF & ".txt"
    End If
    ' Record which .Dat file triggered this event
    Set fso = CreateObject("Scripting.FileSystemObject")
    Set tf = fso.CreateTextFile(strF, True)
    tf.WriteLine(strFile)
    tf.Close
    Set fso = Nothing
End Sub
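The same suffix check can guard the Essbase clear itself so that the MaxL script only runs once, for the main .Dat export. A minimal sketch (VBScript), reusing the commands from the top of this thread and assuming the journal and consolidation passes always carry the -A/-B/-C suffix as observed above:

' Run the data clear only for the main .Dat file, not the -A/-B/-C passes
Dim objShell, strCMD, strSuf
strSuf = UCase(Left(Right(strFile, 6), 2))
If strSuf <> "-A" And strSuf <> "-B" And strSuf <> "-C" Then
    Set objShell = CreateObject("WScript.Shell")
    strCMD = "D:\Oracle\Middleware\EPMSystem11R1\products\Essbase\EssbaseClient\bin\startMAXL.cmd D:\Test.mxl"
    API.DataWindow.Utilities.mShellAndWait strCMD, 0
End If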

How to trigger or run an iBot once a data load completes?

    Hi Experts,
I have a requirement:
1) Every day, one workflow runs (i.e., data is loaded into the data warehouse).
2) Afterwards, the iBot should run and deliver to users.
3) We scheduled the workflows in DAC for every morning.
Requirement:
Once the data is loaded, the iBot should run and be sent to users dynamically (without scheduling).
If the workflow fails, the iBot must not be delivered.
How do I configure the iBot to be triggered once the data load completes?
I am using OBIEE 10g, Informatica 8, and Windows XP.
    Advance thanks..
    Thanks,
    Raja

    Hi,
    Below are the details for automating the OBIEE Scheduler.
Create a batch file or .sh file with the following command:
D:\OracleBI\server\Bin\saschinvoke -u Administrator/udrbiee007 -j 8
-u is the username/password for the Scheduler (the username/password you gave during configuration).
-j is the job id; when you create an iBot it is assigned a new job number, which can be identified from the Job Manager.
    Refer the below thread for more information.
    iBot scheduling after ETL load
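For example, a batch file attached as a post-session command on the last DAC task might look like the sketch below (credentials and the job id are taken from the command above and must match your environment):

@echo off
REM Fire iBot job 8 through the OBIEE Scheduler once the ETL has finished.
REM -u is the scheduler username/password, -j is the job id from Job Manager.
D:\OracleBI\server\Bin\saschinvoke -u Administrator/udrbiee007 -j 8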
Or:
The above will also work, but the problem is that we need to specify a time, like every day at 6:30 AM.
Note: if the report condition is true, the report will be delivered at 6:30 only; if the condition is false, the report will not be triggered.
I also implemented this, but it is a little bit different.
Hope this helps.
    Thanks
    Satya

  • EDT Cat. 15 Data loaded to SAP but failed when updating

    Hi,
I have a structure to load data into BP using KCLJ with category 15.
    Here it is.
    AKTYP
    TYPE
    PARTNER
    ROLE1
    KUNNR
    KUNNR_EXT
    BU_GROUP
    FIBUKRS
    CHIND_ADDR
    NAME_CO
    NAME_ORG1
    NAME_ORG2
    STREET
    STR_SUPPL1
    STR_SUPPL2
    STR_SUPPL3
    LOCATION
    HOUSE_NUM1
    POST_CODE1
    CITY1
    COUNTRY
    REGION
    LANGU
    CHIND_TEL
    TEL_NUMBER
    TEL_EXTENS
We had the data loaded into SAP with roles 000000 and TR0100, but somehow when we tried to update the address through role TR0100 we got a dump and the update failed. However, the update works when we use role 000000.
Would anyone have a clue about this?
    Thanks

I am new to EDT also. I know category 15, which can load BP; SAP has more categories that can load more using KCLJ. Here's my experience.
Under transaction SIMGH, find the IMG structure External Data Transfer for SAP Banking. Under that structure you can find Display Required and Optional Entry Fields for SEM Banking. When you enter 15 in the Category box, you'll see a whole list of fields for BP. You can use these fields to create your own structure.
After you have created your structure, use Define Sender Structure under External Data Transfer for SAP Banking to define it. After that, it is done; you can try using KCLJ to load your BP.
If you still have other issues, it will mostly be the configuration.
    Enjoy.

Further update to data targets process type

    Hi all,
Could anyone let me know the advantage of using the process type "Further update to data target"? Instead of this, we could use the InfoPackage process type directly.
Please let me know.
    Thanks,
    Manjula

    Purpose
    If you have loaded data into a DataStore object, you can use this DataStore object as the source for another InfoProvider. To do this, the data must be active. Use process chains to ensure that one process has ended before any subsequent processes are triggered.
1. Activating the DataStore object data: the data is in the activation queue. When you activate the data, the change log is filled with the data required for a delta update, and the data appears in the table of active data.
2. Updating the data to the connected InfoProviders: using the transformation rules, the change log data (the delta) that has not yet been processed is updated to the other InfoProviders. The data is already available in a cleansed and consolidated format.
    more at this link
    http://help.sap.com/saphelp_nw70/helpdata/en/12/43074208ae2a38e10000000a1550b0/frameset.htm
    hope it helps
    regards
    assign points if it is helpful

  • Database, Dataset, Table Adaptors Error "Unable to load, Update requires a valid DeleteCommand when passed DataRow collection with deleted row"

    Microsoft Visual Basic 2010 Express.
I am new to Visual Basic programming and I am trying to understand the relationships between datasets, databases, and table adapters. I have the following code, which gives me the error "Unable to load, Update requires a valid DeleteCommand when passed DataRow collection with deleted rows".
I can track the error down to the "OffsetTableTableAdapter.Update(MaterionOffsetDataSet.OffsetTable)" line. What am I missing?
It seems that I can delete the data in the DataGridView and it displays the correct data, but my database is not updating, even though the grid displays differently. I can tell because when I save the offset database, I still have all the previous uploads, and all the rows that I wanted to delete are still there.
My final goal is to import offset data from a CSV file, save this data on the PC, and send a copy of it to NumericUpDown controls so the customer can modify certain numbers. From there they download all the data to a controller. If the customer needs to modify the imported data, they can go to a tab with a DataGridView and modify the table. They will also have the option to save the modified data to a CSV file.
I'm not sure if I am overcomplicating this or if there is an easier way to program it.
    CODE:
Private Function LoadOffSetData()
    Dim LoadOffsetDialog As New OpenFileDialog 'create a new open file dialog and setup its parameters
    LoadOffsetDialog.DefaultExt = "csv"
    LoadOffsetDialog.Filter = "csv|*.csv"
    LoadOffsetDialog.Title = "Load Offset Data"
    LoadOffsetDialog.FileName = "RollCoaterOffset.csv"
    If LoadOffsetDialog.ShowDialog() = Windows.Forms.DialogResult.OK Then 'show the dialog and if the result is ok then
        Try
            Dim myStream As New System.IO.StreamReader(LoadOffsetDialog.OpenFile) 'try to open the file with a stream reader
            If (myStream IsNot Nothing) Then 'if the file is valid
                For Each oldRow As MaterionOffsetDataSet.OffsetTableRow In MaterionOffsetDataSet.OffsetTable.Rows
                    oldRow.Delete() 'delete all of the existing rows
                Next
                'OffsetTableTableAdapter.Update(MaterionOffsetDataSet.OffsetTable)
                Dim rowvalue As String
                Dim cellvalue(25) As String
                'Reading CSV file content
                While myStream.Peek() <> -1
                    Dim NRow As MaterionOffsetDataSet.OffsetTableRow
                    rowvalue = myStream.ReadLine()
                    cellvalue = rowvalue.Split(","c) 'check what is ur separator
                    NRow = MaterionOffsetDataSet.OffsetTable.Rows.Add(cellvalue)
                    Me.OffsetTableTableAdapter.Update(NRow)
                End While
                Me.OffsetTableTableAdapter.Update(MaterionOffsetDataSet.OffsetTable)
                MainOffset.Value = OffsetTableTableAdapter.MainOffsetValue 'saves all the table offsets to the offset numericUpDown registers in the main window
                StationOffset01.Value = OffsetTableTableAdapter.Station01Value
                StationOffset02.Value = OffsetTableTableAdapter.Station02Value
                myStream.Close() 'close the stream
                Return True
            Else 'if we were not able to open the file then
                MsgBox("Unable to load, check file name and location") 'let the operator know that the file wasn't able to open
                Return False
            End If
        Catch ex As Exception
            MsgBox("Unable to load, " + ex.Message)
            Return False
        End Try
    Else
        Return False
    End If
End Function

    Hello SaulMTZ,
>>I can track the error down to the "OffsetTableTableAdapter.Update(MaterionOffsetDataSet.OffsetTable)" line. What am I missing?
This error usually shows that you have not initialized the DeleteCommand object; you could check this article to see if you get a workaround.
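A minimal sketch of that workaround, assuming the table lives in an OLE DB data source (with a designer-generated TableAdapter you would instead reconfigure it so the wizard generates the DeleteCommand); the connection string and SELECT statement here are placeholders:

' Imports System.Data.OleDb
Dim conn As New OleDbConnection("your connection string here")
Dim adapter As New OleDbDataAdapter("SELECT * FROM OffsetTable", conn)
' The command builder derives the DeleteCommand (and Insert/Update commands)
' from the SELECT statement, so Update() can also process deleted rows.
Dim builder As New OleDbCommandBuilder(adapter)
adapter.Update(MaterionOffsetDataSet.OffsetTable)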
>>I'm not sure if I am overcomplicating this or if there is an easier way to program it.
If you are working with a CSV file, you could use OleDb to read it, which would treat the CSV file as a table:
http://www.codeproject.com/Articles/27802/Using-OleDb-to-Import-Text-Files-tab-CSV-custom
which seems to be easier (in my opinion).
    Regards.

  • Update types in infopackage for generic data source

    Hi,
Suppose we create a generic DataSource; then:
1) What update types are available in the InfoPackage for that generic DataSource?
2) Is there any link between the selection of
i) the delta type selected and
ii) the field selected, i.e., timestamp, calendar month, or numeric pointer?
3) Do we have a delta option at the InfoPackage level?
    I think you understood my doubts.
    Regards
    Naresh.

hi siva raju,
yes, ours is BI 7.0, but there are two DTPs with the same delta option. They pull the data from the DataSource into the InfoCube.
One more thing: how are the delta records picked up from R/3 into BW? Is it through the InfoPackage, am I right?
If possible, please describe the entire data flow for a generic DataSource, i.e.:
how to load data?
how to set up the delta?
how do we load deltas into BW?
    Regards
    Naresh.
