CJ20N - project structure and data import

Hello dear SAP specialists,
Has anybody ever created an ABAP program to import a project structure and populate it with data?
I'm trying to get started, or to find a working program that imports some projects with data.
The program should use BAPI_PROJECT_MAINTAIN to create the WBS and GUI_UPLOAD to read a text (CSV) file (LSMW cannot be used).
Sincerely,
Me

Hi,
Exactly!
See, that BAPI has many tables; all the data in your flat file has to be mapped to these tables.
Below is some code for a similar load (the material master); you'll have to do the same for your BAPI:
TYPE-POOLS: truxs.
*                           T Y P E S
TYPES: BEGIN OF t_master_data,
          MATERIAL                      Type  MATNR ,
          IND_SECTOR                    Type  MBRSH ,
          MATL_TYPE                     Type  MTART ,
          PLANT                         Type  WERKS ,
          STGE_LOC                      Type  LGORT_D ,
          MATL_DESC                     Type  MAKTX ,
          BASE_UOM                      Type  MEINS ,
          MATL_GROUP                    Type  MATKL ,
          DIVISION                      Type  SPART ,
          ITEM_CAT                      Type  MTPOS_MARA  ,
          GROSS_WT                      Type  BRGEW ,
          UNIT_OF_WT                    Type  GEWEI ,
          NET_WEIGHT                    Type  NTGEW ,
          VOLUME                        Type  VOLUM ,
          SIZE_DIM                      Type  GROES ,
          BASIC_MATL                    Type  WRKST ,
          DOCUMENT                      Type  DZEINR  ,
          DOC_VERS                      Type  DZEIVR  ,
          PO_UNIT                       Type  BSTME ,
          PUR_GROUP                     Type  EKGRP ,
          AUTO_P_ORD                    Type  KAUTB ,
          "BATCH_MGMT Type  XCHPF ,
          PUR_VALKEY                    Type  EKWSL ,
          "GR_PR_TIME Type  WEBAZ ,
          COMM_CODE                     Type  STAWN ,
          COUNTRYORI                    Type  HERKL ,
          MRP_TYPE                      Type  DISMM ,
          REORDER_PT                    Type  MINBE ,
          MRP_CTRLER                    Type  DISPO ,
          LOTSIZEKEY                    Type  DISLS ,
          MINLOTSIZE                    Type  BSTMI ,
          MAXLOTSIZE                    Type  BSTMA ,
          FIXED_LOT                     Type  BSTFE ,
          MAX_STOCK                     Type  MABST ,
          ROUND_VAL                     Type  BSTRF ,
          PROC_TYPE                     Type  BESKZ ,
          SPPROCTYPE                    Type  SOBSL ,
          ISS_ST_LOC                    Type  LGPRO ,
          SLOC_EXPRC                    Type  LGFSB ,
          PLND_DELRY                    Type  PLIFZ ,
          GR_PR_TIME                    Type  WEBAZ ,
          SM_KEY                        Type  FHORI ,
          SAFETY_STK                    Type  EISBE ,
          PLNG_PLANT                    Type  PRWRK ,
          AVAILCHECK                    Type  MTVFP ,
          DEP_REQ_ID                    Type  SBDKZ ,
          ISSUE_UNIT                    Type  AUSME ,
          STGE_BIN                      Type  LGPBE ,
          BATCH_MGMT                    Type  XCHPF ,
          STGEPERIOD                    Type  MAXLZ ,
          STGE_PD_UN                    Type  LZEIH ,
          MINREMLIFE                    Type  MHDRZ ,
          SHELF_LIFE                    Type  MHDHB ,
          PERIOD_IND_EXPIRATION_DATE    Type  DATTP ,
          ROUND_UP_RULE_EXPIRATION_DATE Type  RDMHD ,
          STOR_PCT                      Type  MHDLP ,
          QM_AUTHGRP                    Type  QMATAUTH  ,
          QM_PROCMNT                    Type  QMPUR ,
          CTRL_KEY                      Type  SSQSS ,
*           Type  ART ,
*           Type  AKTIV ,
          VAL_CAT                       Type  BWTTY_D ,
          VAL_CLASS                     Type  BKLAS ,
          PRICE_CTRL                    Type  VPRSV ,
* NEW ADDITION
          STD_PRICE                     Type  STPRS,
          PRICE_UNIT                    Type  PEINH ,
          MOVING_PR                     Type  VERPR ,
          QTY_STRUCT                    Type  CK_EKALREL  ,
          ORIG_GROUP                    Type  HRKFT ,
          ORIG_MAT                      Type  HKMAT ,
          VARIANCE_KEY                  Type  AWSLS ,
          PROFIT_CTR                    Type  PRCTR ,
          LANGU                         Type SPRAS,
       END OF t_master_data.
*                  I N T E R N A L   T A B L E S
DATA:
*     Internal table of type t_master_data
      ist_master_data TYPE TABLE OF t_master_data,
*     Internal table of type BAPIMATHEAD
      ist_headdata    TYPE TABLE OF BAPIMATHEAD,
*     Internal table of type BAPI_MAKT
      ist_mat_desc    TYPE TABLE OF BAPI_MAKT,
*     Internal table of type BAPI_MARM
      ist_uom         TYPE TABLE OF BAPI_MARM,
*     Internal table of type BAPI_MARMX
      ist_uom_x       TYPE TABLE OF BAPI_MARMX.
*                   G L O B A L   V A R I A B L E S
DATA:
      it_num     TYPE num10,
*     Global variable of type truxs_t_text_data
      it_raw     TYPE truxs_t_text_data.
*                       W O R K   A R E A S
DATA:
*     Work area of type t_master_data
      wa_master_data                TYPE t_master_data,
*     Work area of type bapimathead
      wa_bapimathead                TYPE BAPIMATHEAD,
*     Work area of type bapi_mara
      wa_client_data                TYPE BAPI_MARA,
*     Work area of type bapi_marax
      wa_client_data_x              TYPE  BAPI_MARAX,
*     Work area of type bapi_marc
      wa_plant_data                 TYPE BAPI_MARC,
*     Work area of type bapi_marcx
      wa_plant_data_x               TYPE BAPI_MARCX,
*     Work area of type bapi_mard
      wa_storage_location_data      TYPE BAPI_MARD,
*     Work area of type bapi_mardx
      wa_storage_location_data_x    TYPE BAPI_MARDX,
*     Work area of type bapi_mbew
      wa_valuation_data             TYPE BAPI_MBEW,
*     Work area of type bapi_mbewx
      wa_valuation_data_x           TYPE BAPI_MBEWX,
*     Work area of type bapi_makt
      wa_mat_desc                   TYPE BAPI_MAKT,
*     Work area of type bapi_marm
      wa_uom                        TYPE BAPI_MARM,
*     Work area of type bapi_marmx
      wa_uom_x                      TYPE BAPI_MARMX,
*     Work area of type bapi_mpgd
      wa_planning_data              TYPE BAPI_MPGD,
*     Work area of type bapi_mpgdx
      wa_planning_data_x            TYPE BAPI_MPGDX,
*     Work area of type bapiret2
      wa_return                     TYPE BAPIRET2.
*                          P A R A M E T E R S
PARAMETERS:
*     Parameter of type rlgrap-filename
      p_file TYPE  rlgrap-filename.
*               A T   S E L E C T I O N   S C R E E N
AT SELECTION-SCREEN ON VALUE-REQUEST FOR p_file.
  CALL FUNCTION 'F4_FILENAME'
    EXPORTING
      field_name = 'P_FILE'
    IMPORTING
      file_name  = p_file.
*                S T A R T - O F - S E L E C T I O N.
START-OF-SELECTION.
* To upload data from flat file
  CALL FUNCTION 'TEXT_CONVERT_XLS_TO_SAP'
    EXPORTING
      i_line_header        = 'X'
      i_tab_raw_data       = it_raw       " WORK TABLE
      i_filename           = p_file
    TABLES
      i_tab_converted_data = ist_master_data[]  "ACTUAL DATA
    EXCEPTIONS
      conversion_failed    = 1
      OTHERS               = 2.
  IF sy-subrc = 0.
    LOOP AT ist_master_data INTO wa_master_data.
      it_num = wa_master_data-PROFIT_CTR.
      wa_master_data-PROFIT_CTR = it_num.
      MOVE-CORRESPONDING wa_master_data to wa_bapimathead.
      MOVE-CORRESPONDING wa_master_data to wa_client_data.
      wa_client_data_x-MATL_GROUP = 'X'.
      wa_client_data_x-BASE_UOM = 'X'.
      wa_client_data_x-PO_UNIT = 'X'.
      wa_client_data_x-DOCUMENT = 'X'.
      wa_client_data_x-SIZE_DIM = 'X'.
      wa_client_data_x-BASIC_MATL = 'X'.
      wa_client_data_x-PUR_VALKEY = 'X'.
      wa_client_data_x-NET_WEIGHT = 'X'.
      wa_client_data_x-UNIT_OF_WT = 'X'.
      wa_client_data_x-DIVISION = 'X'.
      wa_client_data_x-BATCH_MGMT = 'X'.
      wa_client_data_x-QM_PROCMNT = 'X'.
      wa_client_data_x-MINREMLIFE = 'X'.
      wa_client_data_x-SHELF_LIFE = 'X'.
      wa_client_data_x-STOR_PCT = 'X'.
      wa_client_data_x-ROUND_UP_RULE_EXPIRATION_DATE = 'X'.
      wa_client_data_x-PERIOD_IND_EXPIRATION_DATE = 'X'.
      wa_client_data_x-ITEM_CAT = 'X'.
      MOVE-CORRESPONDING wa_master_data to wa_plant_data.
      wa_plant_data_x-PLANT  = wa_master_data-plant.
      wa_plant_data_x-PUR_GROUP = 'X'.
      wa_plant_data_x-ISSUE_UNIT = 'X'.
      wa_plant_data_x-MRP_TYPE = 'X'.
      wa_plant_data_x-MRP_CTRLER = 'X'.
      wa_plant_data_x-PLND_DELRY = 'X'.
      wa_plant_data_x-GR_PR_TIME = 'X'.
      wa_plant_data_x-LOTSIZEKEY = 'X'.
      wa_plant_data_x-PROC_TYPE = 'X'.
      wa_plant_data_x-SPPROCTYPE = 'X'.
      wa_plant_data_x-SAFETY_STK = 'X'.
      wa_plant_data_x-MINLOTSIZE = 'X'.
      wa_plant_data_x-MAXLOTSIZE = 'X'.
      wa_plant_data_x-FIXED_LOT = 'X'.
      wa_plant_data_x-ROUND_VAL = 'X'.
      wa_plant_data_x-MAX_STOCK = 'X'.
      wa_plant_data_x-DEP_REQ_ID = 'X'.
      wa_plant_data_x-SM_KEY = 'X'.
      wa_plant_data_x-STGEPERIOD = 'X'.
      wa_plant_data_x-STGE_PD_UN = 'X'.
      wa_plant_data_x-CTRL_KEY = 'X'.
      wa_plant_data_x-BATCH_MGMT = 'X'.
      wa_plant_data_x-AVAILCHECK = 'X'.
      wa_plant_data_x-AUTO_P_ORD = 'X'.
      wa_plant_data_x-COMM_CODE = 'X'.
      wa_plant_data_x-COUNTRYORI = 'X'.
      wa_plant_data_x-PROFIT_CTR = 'X'.
      wa_plant_data_x-ISS_ST_LOC = 'X'.
      wa_plant_data_x-VARIANCE_KEY = 'X'.
      wa_plant_data_x-SLOC_EXPRC = 'X'.
      wa_plant_data_x-QM_AUTHGRP = 'X'.
      MOVE-CORRESPONDING wa_master_data to wa_planning_data.
      wa_planning_data_x-PLANT = wa_master_data-plant.
      wa_planning_data_x-PLNG_PLANT = 'X'.
      MOVE-CORRESPONDING wa_master_data to wa_storage_location_data.
      wa_storage_location_data_X-PLANT = wa_master_data-plant.
      wa_storage_location_data_X-STGE_LOC = wa_master_data-stge_loc.
      wa_storage_location_data_X-STGE_BIN = 'X'.
      MOVE-CORRESPONDING wa_master_data to wa_valuation_data.
      wa_valuation_data-VAL_AREA = '1000'.
      wa_valuation_data_X-VAL_AREA = '1000'.
      wa_valuation_data_X-PRICE_CTRL = 'X'.
      wa_valuation_data_X-MOVING_PR  = 'X'.
      wa_valuation_data_X-PRICE_UNIT = 'X'.
      wa_valuation_data_X-STD_PRICE = 'X'.
      wa_valuation_data_X-VAL_CLASS = 'X'.
      wa_valuation_data_X-ORIG_GROUP = 'X'.
      wa_valuation_data_X-QTY_STRUCT = 'X'.
      wa_valuation_data_X-ORIG_MAT = 'X'.
      MOVE-CORRESPONDING wa_master_data to wa_mat_desc.
      APPEND wa_mat_desc to ist_mat_desc.
      IF wa_master_data-PRICE_CTRL = 'S' AND wa_master_data-STD_PRICE IS INITIAL.
        WRITE:/ 'Standard Price not maintained for material ',wa_master_data-MATERIAL.
      ELSE.
        CALL FUNCTION 'BAPI_MATERIAL_SAVEDATA'
          EXPORTING
            HEADDATA             = wa_bapimathead
            CLIENTDATA           = wa_client_data
            CLIENTDATAX          = wa_client_data_x
            PLANTDATA            = wa_plant_data
            PLANTDATAX           = wa_plant_data_X
            PLANNINGDATA         = wa_planning_data
            PLANNINGDATAX        = wa_planning_data_x
            STORAGELOCATIONDATA  = wa_storage_location_data
            STORAGELOCATIONDATAX = wa_storage_location_data_x
            VALUATIONDATA        = wa_valuation_data
            VALUATIONDATAX       = wa_valuation_data_x
          IMPORTING
            RETURN               = wa_return
          TABLES
            MATERIALDESCRIPTION  = ist_mat_desc.
        Write:/ wa_return-message.
        CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'.
        CLEAR: wa_bapimathead,
               wa_client_data,
               wa_client_data_x,
               wa_plant_data,
               wa_plant_data_x,
               wa_planning_data,
               wa_planning_data_x,
               wa_storage_location_data,
               wa_storage_location_data_x,
               wa_valuation_data,
               wa_valuation_data_x.
      ENDIF.
      REFRESH ist_mat_desc.
    ENDLOOP.
  ELSE.
    Write:/ text-001.
  ENDIF.
In the above code, note how I used the same field names in the internal table as in the BAPI structures, so that mapping the data becomes easier since I use MOVE-CORRESPONDING.
Regards,
Shraddha
Edited by: shraddha85 on Jan 31, 2011 9:49 AM
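
Back to the original question: the same pattern works for WBS creation, with GUI_UPLOAD reading the CSV file and BAPI_PROJECT_MAINTAIN doing the create. The following is only a minimal sketch, not a tested program: the project Z-0001, the two-column file layout, and the individual field names used are assumptions; check the exact structures (BAPI_WBS_ELEMENT, BAPI_METHOD_PROJECT, BAPI_METH_MESSAGE) and the method table values in SE37 before using it.

```abap
REPORT z_wbs_upload.

* Assumed CSV layout: <WBS element>,<description> - one WBS per line
PARAMETERS p_file TYPE rlgrap-filename DEFAULT 'C:\wbs.csv'.

DATA: lv_fname  TYPE string,
      lt_file   TYPE TABLE OF string,
      lv_line   TYPE string,
      lv_wbs    TYPE string,
      lv_descr  TYPE string,
      lv_count  TYPE i,
      ls_wbs    TYPE bapi_wbs_element,
      lt_wbs    TYPE TABLE OF bapi_wbs_element,
      ls_method TYPE bapi_method_project,
      lt_method TYPE TABLE OF bapi_method_project,
      ls_return TYPE bapiret2,
      ls_msg    TYPE bapi_meth_message,
      lt_msg    TYPE TABLE OF bapi_meth_message.

START-OF-SELECTION.
  lv_fname = p_file.
* Read the CSV file from the presentation server
  CALL FUNCTION 'GUI_UPLOAD'
    EXPORTING
      filename = lv_fname
      filetype = 'ASC'
    TABLES
      data_tab = lt_file
    EXCEPTIONS
      OTHERS   = 1.
  IF sy-subrc <> 0.
    WRITE: / 'Upload failed'.
    RETURN.
  ENDIF.

  LOOP AT lt_file INTO lv_line.
    SPLIT lv_line AT ',' INTO lv_wbs lv_descr.
    lv_count = lv_count + 1.
*   One WBS element per file line; the project is assumed to exist already
    CLEAR ls_wbs.
    ls_wbs-wbs_element        = lv_wbs.
    ls_wbs-description        = lv_descr.
    ls_wbs-project_definition = 'Z-0001'.      " assumption
    APPEND ls_wbs TO lt_wbs.
*   One method entry per object; REFNUMBER links it to the data table row
    CLEAR ls_method.
    ls_method-refnumber  = lv_count.
    ls_method-objecttype = 'WBS-Element'.
    ls_method-method     = 'Create'.
    ls_method-objectkey  = lv_wbs.
    APPEND ls_method TO lt_method.
  ENDLOOP.

* A final 'Save' entry tells the BAPI to post all the methods above
  CLEAR ls_method.
  ls_method-method = 'Save'.
  APPEND ls_method TO lt_method.

  CALL FUNCTION 'BAPI_PROJECT_MAINTAIN'
    IMPORTING
      return              = ls_return
    TABLES
      i_method_project    = lt_method
      i_wbs_element_table = lt_wbs
      e_message_table     = lt_msg.

  IF ls_return-type CA 'EA'.
    LOOP AT lt_msg INTO ls_msg.
      WRITE: / ls_msg-message_text.
    ENDLOOP.
  ELSE.
    CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'.
  ENDIF.
```

REFNUMBER ties each method entry to the row with the same index in the corresponding data table, and the trailing 'Save' entry posts the whole batch in one call; that is why the commit only happens after the BAPI reports no error.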

Similar Messages

  • Copying table structure and data.Please help

    Hi,
    I have a table called preview, with 100 million records in it. How can I copy the table structure and data into a table called temp_preview?
    Is it efficient to use
    create table temp_preview tablespace temp
    as select * from preview;
    Thanks

    Hi,
    If it is on the same database, that is the most efficient way.
    Regards
    Anurag Tibrewal
    PS: You can try parallelism, but it is not a performance improvement as far as resources are concerned; instead of one thread, parallel threads would be working, so it just reduces the time the query takes to complete.

  • Reports, Dashboard and Data Import Documentation

    Do you know where I can find documentation regarding Reports and Analytics, Dashboard and data import release 18?
    Thanks

    Click on the Training and Support link and search for "*Release 18 Resources*",
    or click on this link for the resources
    http://download.oracle.com/docs/cd/E15799_01/homepage.htm

  • Structures and Data

    Hello, I have a question; I think I know the answer already, but I want to make sure with you gurus. Someone told me that structures do not contain data, but I have seen code where people enhance structures that are on the LBWE screen. So if they append a structure, where is the data kept? I know there are underlying tables like VBAP, VBAK etc., so does the data come from those tables into the structure when I append it? What happens then? Does it go to a setup table? Where does it go, and what triggers the data to come into the structures from those tables? Please help. Thanks. (E.g. a structure like MC02_0SGR.)
    I did some searching and found some material. So when the user hits the Save button, the data is saved in the database table, and in the logistics application case that record is also sent to LBWQ. Now suppose I append a structure and do some enhancement on it: say I add a field called Quantity and write some code that fills MC02_0SGR-Quantity from an internal table when some condition is met. The next time a user saves a record it will go straight to, say, VBAK, and the same record will also go to LBWQ; am I correct? At the time the user saved the record, the Quantity field was not yet calculated, because the record had not gone through my code yet, right? I am just trying to understand the flow. I don't know if there is any documentation on this. Can someone help me please? I am confused. Thanks, guys.
    Edited by: Vik1900 on Jun 13, 2009 2:28 PM

    Hi,
    If you enhance the DataSource in ECC:
    E.g. I enhanced the XYZ structure and added a ZQty field, which fetches its data from the ABC table using some code in CMOD. When users enter data through some transaction in ECC, the data is stored in the base tables; from there, using the code, we fetch it and extract it into BW. You can see the data in LBWQ/RSA7 in ECC, which contain the data for all the fields in the DataSource.
    If you want to load data to BW using setup tables, you must take some downtime in ECC and load the data.
    Steps:
    1. Take downtime in ECC.
    2. Delete the setup tables in ECC.
    3. Clear RSA7 in ECC by running the V3 job 2-3 times and loading the deltas into BW.
    4. Check whether it is cleared in SMQ1 and RSA7.
    5. Fill the setup tables.
    6. Load the init data into BW.
    7. Set up the V3 job in ECC and load the deltas into BW.
    Thanks
    Reddy

  • Project structure and deployment

    Hello,
    We are working on an LCDS project using JBoss under Linux.
    1. It seems that no matter whether the project is to be
    compiled with Flex Builder or LCDS, the project folder is always
    created under the flex.war folder (choosing another location, e.g
    at the same level as flex.war triggers a warning in the Flex
    Builder project dialog). Is there a way to put it somewhere else ?
    2. We need to write some Java adapter classes for the
    RemoteObject. We created a separate Java project for this in
    MyEclipse. So, we have a Flex project and a Java project. We would
    like to create a single WAR file to include the output of both. Can
    you do this directly with MyEclipse or do you have to use Ant ?
    Thanks,
    Karl Sigiscar.


  • What's the easiest way to move app data and data structures to a server?

    Hi guys,
    I've been developing my app locally with Apex 4.2 and Oracle 11g XE on Windows 7. It's getting close to the time to move the app to an Oracle Apex server. I imagine Export/Import is the way to move the app. But what about the app tables and data (those tables/data like "customer" and "account" created specifically for the app)? I've been using a data modeling tool, so I can run a DDL script to create the data structures on the server. What is the easiest way to move the app data to the server? Is there a way to move both structures and data in one process?
    Thanks,
    Kim

    There's probably another way to get here, but in SQL Developer, in the tree navigation, expand the objects down to your table, right-click, then click Export; there you will see all the options. This is a tedious process and it sucks IMO, but yeah, it works. It sucks mostly because 1) it's one table at a time, and 2) if your data model is robust and has constraints, sequences, and triggers, then you'll have to disable them all for the insert, and hope that you can re-enable them without a glitch (good luck, unless you have only a handful of tables).
    I prefer using the Oracle command-line EXP to export an entire schema; then on the target server I use IMP to import the schema. That way, it's a near-exact copy. This makes life messy if you develop more than one application in a single schema, and I've felt that pain; however, it's a whole lot easier to drop tables and other objects than it is to create them! (Thus, even if EXP/IMP moved more than you wanted to move, just blow away what you don't want on the target after the fact.)
    You could use Oracle's Data Pump method too.
    Alternatively, if you have access to both servers from your SQL Developer instance (or if you can tnsping them both from the command line, you can use SQL*Plus), you can run a script that identifies your Apex apps' objects (usually by a prefix on object names, like EBA_PROJ_%, etc.) and does all the manual work for you. I've created a script that does exactly this so that I can move data from dev to prod servers over a dblink. It's tricky because of the order in which statements must be executed to disable constraints and then re-enable them, and of course trickier if you don't consistently prefix ALL of your application objects (tables, views, triggers, sequences, functions, procs, indexes, etc.).

  • Copying database objects and data from one server database to another server database in AG group

    Hi,
    I am still trying to wrap my head around sql clusters and AGs and I have a project that requires I take a vendor's database and restore it weekly so its available on the production server which is clustered.
    The vendor's database on the cluster is in an AG group and encrypted.
    Right now, I plan to restore the database on a sql staging server and use the SSIS Transfer SQL Server Objects Task to copy the table structure and data from Stage to the Production database of same name and I would first drop the objects in production
    database using the same task.
    I am concerned that this might cause issues with the passive cluster due to "logging" from active to passive. The database is about 260 MBs and I am not sure how many tables.
    Has anyone run into this type of scenario before or have a better solution?
    Thanks
    Sue

    IF I understand anything about clustered sql and logging, the sql server should take the log file and recreate the same scenario on the passive side of the cluster.
    Is that correct?
    Hi Sue,
    Yes, for an AlwaysOn Availability Group, the transaction log is basically replayed from the primary to all of the secondaries.
    Besides, from my point of view, as we cannot directly restore a database that is part of an Availability Group, it is a good approach to use an SSIS task to drop and recreate all tables and then transfer data from the restored database to the primary replica. Schema changes and data changes will also happen on the secondary replica.
    There are some similar links for your reference.
    http://dba.stackexchange.com/questions/21404/do-schema-changes-break-sql-server-2012-alwayson-or-are-they-handled-transpare
    http://blogs.msdn.com/b/sqlgardner/archive/2012/08/28/sql-2012-alwayson-and-backups-part-3-restore.aspx
    Thanks,
    Lydia Zhang

  • How to create a project structure with the Business Blueprint transaction S

    Hi
    How do I create a project structure and add the required scenarios for my SAP system to the project structure with the Business Blueprint transaction SOLAR01?
    Also, how do I add my SAP system configuration structures to the project structure with the configuration transaction SOLAR02?
    Thanks,

    Dear friend,
    Have you already created a project?
    Select it in SOLAR01 and click the Structure tab.
    On the left side select Business Scenarios, and on the right side select the Structure tab.
    Press F4 here and check that the Business Process Repository is selected.
    Now you have all the standard business processes.
    For non-standard ones, just type their names and press the Save button.
    Add the relevant ones.
    Once added here in SOLAR01, they are reflected in SOLAR02 automatically, with the relevant data for the standard business processes.
    You have to add the data manually for the non-standard scenarios you have added.
    Hope it clarifies.
    Please assign points.

  • Vertical tables and date tracked fields

    We are in process of remodeling our database. We will have true 3rd normal form tables with vertical structure and date tracked fields among many other changes. I'm interested in Toplink's support capability.
    We are evaluating using stored procedures to do CRUD operations and use toplink to do only reads. Has anyone used toplink in this type of database model and how is it implemented?
    Thanks

    Sure. Having your database in 3rd normal form will only make your object model more consistent and more efficient to modify.
    TopLink has great stored procedure support and also performs very favourably on reads, so it serves this kind of model quite well. The caveat is that TopLink tends to cache fairly aggressively by default. If you are doing writes outside of TopLink, then you will need to determine a strategy for refreshing cached objects that may have changed in the db. TopLink does provide a number of ways to do that, including cache eviction policies, implicit and explicit refreshing, and cache hit disabling when necessary.

  • Managing Org units and Master Data in project structure

    How are org units and master data managed in the project structure? What would be the appropriate placeholders for them in the structure? Should we manage them at the top of the project structure or within the business scenarios/processes? On what basis should we decide this? How would this impact a global template rollout?
    Please share your experiences regarding this.
    Thanks.
    Mike

    Any talented guy does it this way.
    I think you are working on an E2E implementation project.
    Before sign-off of your business processes, you should give your users the master data templates; they collect the master data before the realization phase. At that time you should know how many specifications there are, and of the inspections, how many are quantitative and how many are qualitative. If you have any query, get back to me.
    Edited by: Lakshmiananda prasad on Oct 6, 2009 11:48 AM

  • I would like to import two different cf cards from two different cameras into the same project/folder and have them be in order of the times they were taken, is there a trick?

    I would like to import two different cf cards from two different cameras into the same project/folder and have them be in the order of the times they were taken, any ideas on how to do this?

    Just import them normally and sort the project by date; they will fall into place. If you tried this and it isn't happening, then make sure the dates and times on the two cameras are identical, and make sure you are sorting by date and time and nothing else.

  • Import Manager Usage : Approaches for developing Import file structure and text validations

    Hi Experts,
    We have 50+ import maps, and we have given users the option to drop files for data import. Currently, Import Manager (7.1 SP08) does not have the capability of pre-import validation of file data, such as:
    a. file structure - number of columns specific to import map
    b. file text validations - special characters, empty lines, empty cells
    c. Uniqueness of the records in the file
    For this, we are planning to build a temporary folder (port-specific) into which users drop the files. We will use custom development to do the above-mentioned validations and then move the files to the actual import ports.
    Please let us know if you have worked on similar requirements and how you have fulfilled such requirements.
    Regards,
    Ganga

    Hi Ganga,
    Assuming you have a well-defined XSD and are getting valid XMLs from the source in the inbound port of MDM. Also, you have a primary key in the form of an External ID (say).
    So just by creating and defining an XSD you get most of what you want in your questions a and b.
    Now if you wish to use PI to drop files in the inbound port, then you can build all the validations in PI itself and you would not need a staging table.
    Otherwise, you can have another table (preferably a main table) in the same repository, or in a dummy repository, where records are created on import based on the External ID.
    Here you can launch an MDM workflow on import of these records and run assignments to replace unwanted characters, and validations to raise errors for rejecting records based on the data quality level desired. Once the unwanted characters are removed and the data is validated, it can be syndicated using a syndication step in the workflow, so records which fail are not sent, and records which pass are sent to an outbound port.
    From the outbound port, PI or some other job can pick the file from the outbound folder and drop it into the inbound folder of the same repository, which imports into the required primary main table. Here again you have the option to leverage validations in PI and further check that the data is fine.
    Once this activity is done you can delete the records from the staging table.
    Thanks,
    Ravi

  • Can't find last import from iPhone! Noticed as I imported I had that view where there was only one project symbol, not all my projects showing by date, but even so pictures were imported. It took the time to do them, so where

    Can't find my last import from iPhone!
    I noticed that as I imported I had the view where there was only one project symbol, not all my projects showing by date, but even so the pictures were imported. It took the time to do them, so where are they? I have closed and restarted to get back the full view of all dated projects, but nothing can be found from today...

    Have I posted this right...?? No response from the Apple guys...

  • Importing SQL Server Schema and Data

    What is the best way to go about importing schema objects (tables, contstraints, triggers, stored procs, etc.) and data from SQL Server to Oracle?

    No offence taken :)
    But I have copied data from a SQL Server DB to Oracle using SQL Server Enterprise Manager's Data Transformation Services.
    About 30 tables totaling a few hundred thousand rows (~300K) took about 1.5 hours.
    I did it several times, when testing and for the production run, in a data conversion project. Of course, when copied to Oracle, all the data was processed to fit our new data model, but that is another story.
    It was a quick and easy solution. OK, if one needs to copy at least a few million rows, then he probably needs to look for different tools.
    Gints Plivna
    http://www.gplivna.eu

  • Iptc and xmp, data important for organizing photos?

    HI, I keep encountering these two acronyms (IPTC and XMP) while working with my photos in Aperture.  Are these data important, should I be doing anything with it?  Essentially what I am doing is, after importing photos from iPhoto, I am filing to Folders, Projects, and Albums (some Smart). Thanks

    Astechman,
    EXIF (which you didn't ask about) is the Exchangeable Image File Format, and those fields are generally physical attributes of your photos, as recorded by your camera: things like date/time, aperture, shutter speed, etc. Some people like to think they should be able to change those, but that doesn't make any sense (except if your camera's clock is wrong).
    IPTC (International Press Telecommunications Council) data is used in digital media as metadata for the author to put things.  These include things like keywords, location narratives, copyright notice, photographer name.  I.e., things that don't have anything to do with the camera or the physical attributes of the photo, but about the subject/content of the photo or the photographer.
    XMP adds onto IPTC, and is often associated with a "sidecar" file, in which a digital asset manager system (DAMS) saves extra metadata to such a file.  (Aperture is a DAMS, but it does not use sidecar files; it keeps data like that within the library.)
    As for what you should be doing with it -- that's up to you.  How much metadata do you want associated with your photos?  Fill in those fields and just ignore the rest.  I tend to fill in keywords and copyright, and that's about it, but there are many other fields that may be of interest to you.
    nathan
