Tutorial for new Data Upload feature in 4.1?

Greetings,
Is there a tutorial available for using the new Data Upload feature in 4.1? We have not upgraded to 4.1 yet, but I would like to try out the new feature on my APEX test workspace so that I am ready to use it once we are upgraded.
Thanks,
John

I installed the Product Portal Sample Database and went to page 21. Very nice looking. Is there any tutorial for this Sample application? In other words, is there a tutorial that uses this application as its basis?
What I am really looking for (my apologies if I have not been clear) is a tutorial that steps you through setting up the new Data Upload feature in APEX 4.1. I would like to create a new application in my test workspace and learn how to set up the Data Upload page.
Seeing the Data Load in action is very helpful though. Thanks for pointing me to this.
Thanks,
John

Similar Messages

  • Period is locked for new data

    Good Morning,
    Anyone get this error before?
    -4013:Period is locked for new data
       at NetPoint.SynchSBO.SBOObjects.SBOOrder.NetPointToSBOOrder(NPOrder order)
       at NetPoint.SynchSBO.SBOObjects.SBOOrder.NetPointToSBO(NPQueueObject qData)
       at NetPoint.SynchSBO.SynchObjectBase.Synch()
    Haven't received it before today, of course, the day we go live.  The period we are posting to is not locked for new data and it seems to be limited to one customer only...  Any immediate thoughts?
    Thanks so much,
    Kristen

    OK, I have it working for the moment, but I am very curious about this.  I went back in and selected the Install Plugin button 2 more times just to be sure, I restarted everything, and the stuck orders went through fine.  Could it be that the plugin had installed only partially before?  It didn't seem so, because other orders were flowing both ways and just certain ones were getting stuck...  Then more orders gradually started failing until none were passing through.  This had been working seamlessly in this environment for weeks until, of course, go-live today, so I am at a loss.  Probably just that one piece of excitement for go-live that we were missing.  All is well.  James, thanks for responding.

  • Period is locked for new data [Message 131-107]

    Hi all,
    One of my clients faced the problem "Period is locked for new data [Message 131-107]" when doing Period-End Closing for year 2008 in SAP Business One 2007A Patch 42.
    Can anyone help me?
    Thank you.
    Best regards,
    danny

    Hi Danny,
    Check the link: Period End Closing
    Please close the thread if the issue is solved.
    Regards
    Jambulingam.P

  • Period locked for new data on PO and SO

    Since last week, when we try to update a PO or SO we get this message:
    PERIOD LOCKED FOR NEW DATA.
    For example, we change only the remarks field and we still get this message.
    Since POs and SOs do not impact any GL entry, this is not normal. On the DEMO DB I can update an SO or PO when the period is closed without any problem.

    Hi Roy,
    First you have to check whether your posting period is closed.
    If it is closed, choose your posting period, then click the triangle sign on the left side of the window.
    After that, change the status, and you will then be able to make changes in the previous posting period.
    This may help you...
    Thanks.
    JRAJPUT

  • Reg: Efficient solution for a data upload scenario

    Hi All,
        I have the following task.
        I need to move data from a legacy system (which generates data only in the form of flat files) into SAP R/3 as FB01 journals, and the output file should be generated periodically (daily, weekly, fortnightly, etc.).
    Solution Approaches:
    1) Write a BDC program to upload the data.
    2) Write an ABAP program to populate an IDoc (if a standard IDoc is available) or generate an outbound proxy (if a standard IDoc is not available) to push the data into SAP XI.
    Could anyone tell me which would be the best and most efficient approach for this task? I need your recommendations.
    Thanks in Advance.
    B.Lavanya
    Edited by: Lavanya Balanandham on Mar 31, 2008 2:23 PM

    Hi Lavanya,
    Required data from a legacy system (flat files only) to SAP R/3 as FB01 journals - use BDC for this, because it works better for large source files.
    The output file should be generated periodically (daily, weekly, fortnightly, etc.) - if this output file contains an acknowledgment for the data uploaded by the above process, create an ABAP report for it and schedule it. But if this output contains other IDoc data which you need to send as a file to a third-party system, then go for SAP XI, provided the IDoc data is not too large. If the IDoc size is huge, just create an ABAP report that writes the data to a file on the application server and FTP the file to the third-party system.
    Regards,
    Rajeev Gupta

  • Function module Vs BDC for master data upload

    Hi ,
    Please advise whether we should use the following function modules for master data upload, or whether we should go for BDC.
    MP_RFC_SINGLE_CREATE
    MP_RFC_INACT_CHANGE
    MPLAN_CREATE
    MPLAN_CHANGE
    MPLAN_SET_DELETION_INDICATOR
    ASSET_MASTERRECORD_MAINTENANCE
    MPLAN_ITEM_CREATE
    MPLAN_ITEM_CHANGE
    GL_ACCT_MASTER_SAVE
    Actually, we have already used these function modules in our upload program, but we are not sure whether these function modules will create any data inconsistency.
    Please let me know if we should continue using the FMs, or if there is any risk in using them and we should replace them with BDC.
    Thanks in advance.

    Hi Vikram,
    It is better to search for a BAPI for uploading the master data, because there are problems with both BDC and FMs.
    If you use FMs, they may not contain all the fields you want. If you go for BDC, it is not maintainable for future releases; if you upgrade, the screens may change.
    If you don't have a BAPI, then it is better to go for BDC.
    Thanks

  • LSMW used only for master data upload?

    Hi
    Can you please let me know if LSMW is used only for master data upload, or can we also use it for transaction data?

    Hi Christino,
    I have come across standard SDN threads which deal with uploading master data; refer to them:
    - SDN reference for uploading master data using LSMW: "how can we upload master data by using LSMW"
    - SDN reference for which upload approach is preferred (master data or transaction data): "Which one is better for uploading data LSMW or ECATT?"
    Good Luck & Regards.
    HARSH

  • ERR:10003 Unexpected data store file exists for new data store

    Our TimesTen application crashes and then cannot connect to the TimesTen data store; when we use ttIsql we get the error "10003 Unexpected data store file exists for new data store", so we must rebuild the data store.
    I guess the application damages the data store because we use "direct-linked" mode. Is that true?
    Should I use "client-server" mode if our data is very important?
    thx!

    Your question raises several important discussion points:
    It is possible (though very unlikely in practice) for a C or C++ program operating in direct mode to damage the contents of the datastore e.g. by writing through an invalid memory pointer. In the 11+ years that TimesTen has existed as a commercial product we have so far never seen any support case where this was diagnosed as the cause of a problem. However, it is definitely a theoretical possibility and rigorous program testing and use of tools such as Purify is strongly recommended when developing in C or C++ in direct mode. Java programs running in direct mode are completely 'safe' unless they invoke non-Java code via JNI when a similar risk is present.
    The reality is that most customers who use TimesTen in very high performance mission critical applications use mainly direct mode...
    Note also that an application crashing should not cause any damage or corruption to a datastore, even if it is using direct mode, as TimesTen contains explicit mechanisms to guard against this.
    Your specific problem (error 10003) is nothing to do with the datastore being damaged. This error reflects a discrepancy between the instance main daemon's metadata about all the datastores that it is managing and the reality. This error occurs when the main daemon does not know about a datastore and yet, when it comes to connect to (and hence create) the datastore, it finds that checkpoint or log files already exist. The main daemon's metadata is managed solely by the main daemon and is completely separate from the datastore and datastore files (the default location is <tt_instance_install_directory>/info, though you can change this at install time). The usual cause of this is that someone has been manually manipulating files within that directory (which of course you should never do) and has removed or renamed the .DBI file corresponding to the datastore.
    This error should never arise under normal circumstances and certainly not just because some application has crashed.
    Rather than simply switching to the (much slower) client/server mode I think we should try and understand why this error is occurring. Could you please post the following:
    1. Output of ttVersion command
    and then we can take it from there.
    Thanks, Chris
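    For reference, here is a minimal sketch of the difference between the two connection modes, using the TimesTen JDBC driver; the DSN names sampledb and sampledbCS are hypothetical placeholders for your own DSNs:

        import java.sql.Connection;
        import java.sql.DriverManager;

        public class TimesTenModes {
            public static void main(String[] args) throws Exception {
                // Register the TimesTen JDBC driver.
                Class.forName("com.timesten.jdbc.TimesTenDriver");

                // Direct mode: the application attaches to the data store in its
                // own address space - fastest, but a native-code bug in the same
                // process could in theory corrupt the store.
                Connection direct = DriverManager.getConnection(
                        "jdbc:timesten:direct:dsn=sampledb");
                System.out.println("connected in direct mode");
                direct.close();

                // Client/server mode: the application talks to a separate server
                // process - slower, but isolated from the data store files.
                Connection cs = DriverManager.getConnection(
                        "jdbc:timesten:client:dsn=sampledbCS");
                System.out.println("connected in client/server mode");
                cs.close();
            }
        }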

  • Maintain number range interval for master data upload for existing employee

    Hi  Experts,
    1) I have a scenario where I need to upload additional data for existing employees in PA. The employees already exist in SAP HR, but additional infotypes need to be maintained for those employees.
    2) I have a scenario where I have to upload master data for new employees.
    Please give a detailed description of how to maintain the number range interval (i.e. external or internal) for the upload in both of the above scenarios. Do we have to maintain the number range manually in the master data record and then upload it through BDC or LSMW?
    << Moderator message - Everyone's problem is important. But the answers in the forum are provided by volunteers. Please do not ask for help quickly. >>
    Edited by: Rob Burbank on Jan 12, 2011 3:49 PM

    >
    s c patil wrote:
    > 2) For new employees I have to maintain the desired (mine or the client's?) number range in the SAP system as an external number range, then default that number range in NUMKR, then maintain those numbers in the master data record, then get the data template filled in by the client, then upload the data, and after that create a new number range, next to the existing external number range, as an internal number range, and then default that internal number range.
    >
    > Pls reply ASAP
    Yes, Mr. Patil.
    For existing employees:
    you need to execute the HIRING action through BDC with an external number range. While recording, you have to use at least three infotypes, i.e. IT0000, IT0001, and IT0002. In addition, you can upload other infotypes through PA30.
    For new employees:
    during configuration you can create another number range as internal for new hires, and use the NUMKR feature as well.
    I don't understand why you are looking for an upload process for new hires if this is not mass hiring; it should be a day-to-day activity done by the user through PA40.
    Best Regards,
    Anand Singh

  • Optimization for bulk data upload

    Hi everyone!
    I've got the following issue:
    I have to do a bulk data upload using JMS, deployed on GlassFish 2.1, to process and validate data against an Oracle 10g database before it is inserted.
    I have a web interface that loads a file and then delegates the processing to a stateless session bean, which reads N lines at a time and then sends a message to a JMS queue. The MDB has to parse each line, validate it against the data already in the DB, and finally persist the new data.
    This process is very compute-intensive, and I need to improve the processing time. I tried changing the GlassFish default JMS and JDBC pool sizes, but it did not make a big difference.
    Do you have any advice that could help me?
    Thanks in advance!

    Hi! Thank you for your answer!
    The heavy processing is in the MDB.
    I'm grouping every N lines read in the EJB and then sending the message to the JMS queue. The MDB then persists each line as records in different related tables.
    Thanks again!
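    As an illustration, here is a minimal sketch of an MDB that persists one batch per message using JDBC statement batching instead of row-by-row inserts; the queue, the DataSource name jdbc/uploadDS, the upload_rows table, and the semicolon delimiter are all hypothetical:

        import java.sql.Connection;
        import java.sql.PreparedStatement;
        import java.util.List;
        import javax.annotation.Resource;
        import javax.ejb.ActivationConfigProperty;
        import javax.ejb.MessageDriven;
        import javax.jms.Message;
        import javax.jms.MessageListener;
        import javax.jms.ObjectMessage;
        import javax.sql.DataSource;

        @MessageDriven(activationConfig = {
            @ActivationConfigProperty(propertyName = "destinationType",
                                      propertyValue = "javax.jms.Queue")
        })
        public class BulkUploadMDB implements MessageListener {

            // Hypothetical container-managed pool pointing at the Oracle 10g DB.
            @Resource(mappedName = "jdbc/uploadDS")
            private DataSource dataSource;

            public void onMessage(Message message) {
                try {
                    // Each message carries the N lines grouped by the session bean.
                    List<String> lines =
                        (List<String>) ((ObjectMessage) message).getObject();

                    Connection con = dataSource.getConnection();
                    try {
                        PreparedStatement ps = con.prepareStatement(
                            "INSERT INTO upload_rows (col1, col2) VALUES (?, ?)");
                        for (String line : lines) {
                            String[] fields = line.split(";"); // assumed delimiter
                            // ...validation against existing data goes here,
                            // ideally one set-based query per batch, not per line...
                            ps.setString(1, fields[0]);
                            ps.setString(2, fields[1]);
                            ps.addBatch();            // queue locally, no round trip
                        }
                        ps.executeBatch();            // one round trip per batch
                        ps.close();
                    } finally {
                        con.close();
                    }
                } catch (Exception e) {
                    throw new RuntimeException("batch insert failed", e);
                }
            }
        }

    Batching the inserts and doing the validation lookups once per message, instead of once per line, usually buys far more than tuning pool sizes.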

  • Discrepancy while REPLICATING META DATA (for new DATA SOURCE) on the BI side

    In R/3 I have created a simple TRANSACTIONAL data store based on an INFO QUERY.
    I even checked the veracity of this data store using RSA3 to see whether it extracts data. It works perfectly.
    I go to the BW side, click on the SAP module under which I created the data source, right-click, and select 'REPLICATE'.
    (I would love to post the screenshot here, but I think I may not be able to paste BMP files here.)
    I will write out the contents of the pop-up that appears:
    Title:Data Source from Source System Unknown
    Pop-up contents:
    Data Source (OSOA) DS_01
    does not exist in BI system
    How do you want to create the object in BI?
    1. as DataSource (RSDS)
    2. as 3.x DataSource (ISFS)
    3. this and following 3 as DataSource (RSDS)
    4. this and following 3 as 3.x DataSource (ISFS).
    Well, I normally select option three, as per my instructions (without knowing the real reason).
    But sometimes, either for the same data source or for other data sources that I created, the pop-up appears like this:
    Title:Data Source from Source System Unknown
    Pop-up contents:
    Data Source (OSOA) DS_01
    does not exist in BI system
    How do you want to create the object in BI?
    1. as DataSource (RSDS)
    2. as 3.x DataSource (ISFS)
    Just TWO options.
    And if I select option 1, the data source does not work properly on the BI side, though it worked perfectly in R/3 under transaction RSA3 and showed me data.
    For some unknown reason, if I delete the erroneous DataSource on the BI side, sleep overnight, come back in the morning, and replicate again, the pop-up sometimes appears with FOUR options (notice the word 'SOMETIMES').
    Can someone explain the secret behind this?
    Thanks again in advance,
    Gold

    3. this and following 3 as DataSource (RSDS)
    That means there were a total of 3 new DataSources (not yet in BI) available, and you were asked whether you wanted to replicate them as 7.0 DataSources (RSDS) or 3.x DataSources (ISFS).
    (The other 2 DataSources were activated from RSA5, or created by other users under that SAP module.)
    If there is only 1 new DataSource, you will get just TWO options:
    1. as DataSource (RSDS)
    2. as 3.x DataSource (ISFS)
    After replicating with option 1, you should activate the DataSource in BI, then create InfoPackages, transformations, DTPs, etc.

  • Swap out dataset for new data possible?

    If I use the same column names in my underlying database tables, is it possible to use them in an existing report to refresh the data for a new quarter without wiping out my existing reports? The reports took a long time to set up, so I hope I can do this.
    If it is possible, can you please point me in the right direction?
    Thanks in advance. I think this is probably a basic concept, but I'm having a mental block.

    Hi,
    I don't think it is possible to append data to an existing report. Once the data is rendered to the report, it is pretty much static; if you want new data to show up, you need to refresh and re-render the report. Alternatively, if your report is taking a long time to render, consider caching the reports:
    http://msdn.microsoft.com/en-IN/library/ms155927.aspx
    Best Regards, Sorna

  • Query for monitor data upload

    Hi, Experts,
    Normally in a cube we just have the request ID, which carries only a number and nothing else (no request date, time, selection, type of data upload...).
    Can I make a BEx query show information just like the cube's Manage screen? We have to check whether there is a duplicated selection request that was not deleted, or a missing request, in the case of multiple DataSources loading one cube.
    I cannot find any useful query among the BW statistics queries.
    Thanks in advance.

    I also cannot find enough information in table RSMONICDP.
    In our case, cube 0COOM_C02 has lots of InfoSources; some are full uploads and some are delta uploads. All of the InfoPackages are scheduled in one process chain.
    When I go to the log of this process chain, I find that errors happened on some days, so sometimes the process chain did not finish, which means cube 0COOM_C02 has missing requests and duplicated requests.
    It is hard to use cube manage to find all of the problem requests because there are so many requests and the window is so small. So my question is: is there any BEx query or BW table that shows information similar to the cube - Manage - Requests tab,
    so I can analyze it in Excel? That would be quite easy for me.
    Thank you all

  • Xls. sheet for Master data upload

    Hi, can anybody suggest or send me a sample of how to maintain an .xls sheet template of the particular fields for infotypes 0, 1, 2, 7, 8, etc., for the purpose of data upload?
    <removed by Moderator>
    thanks
    S Mishra

    Hi Mishra,
    You can look into the standard Business Blueprint templates - SAP's Data Transfer Tool - to get an idea...
    Check Note 1060029 - SAP Best Practices for HCM US - Variants, Misc, LSMW...
    You will find files for the infotypes, and you can set up your templates based on these.
    The other way is to
    go to SE11 and enter the infotype number PNNNN (with NNNN being the infotype).
    It presents you with the structure. Copy that structure, and you can create your Excel sheet with that template.
    Good luck!
    Kumarpal Jain.

  • Issue with DSO Table for New data

    When I click on the contents of the new data table in the Manage screen of this particular DSO, it hangs forever. Also, when I go to SE11 and display the related new data table, it hangs forever.
    Has anybody experienced this before, or what could this issue be related to?
    Thanks in advance

    Ask your DBA to see if he or she is able to access the same table from the backend...
    When does it hang?
    When displaying the table structure in SE11, when going to the first selection screen from SE11, or afterwards?
