Golden Gate for flat file

Hi,
I have used GoldenGate with Oracle and non-Oracle databases. Now I am trying the flat file adapter.
What I have done so far:
1. I have downloaded Oracle "GoldenGate Application Adapters 11.1.1.0.0 for JMS and Flat File Media Pack"
2. Installed it on the same machine where the database and the GG manager process exist. The port for the GG manager process is 7809; for the flat file adapter's manager, 7816.
3. Following the GG Flat File Administrator's Guide, page 9 (Configuration).
4. Extract process on the GG manager side:
edit params FFE711
extract ffe711
userid ggs@bidb, password ggs12345
discardfile E:\GoldenGate11gMediaPack\V26071-01\dirrpt\EXTFF.dsc, purge
rmthost 10.180.182.77, mgrport 7816
rmtfile E:\GoldenGate11gMediaPack\V26071-01\dirdat\ffremote, purge, megabytes 5
add extract FFE711, EXTTRAILSOURCE ./dirdat/oo
add rmttrail ./dirdat/pp, extract FFE711, megabytes 20
start extract FFE711
view report ffe711
Oracle GoldenGate Capture for Oracle
Version 11.1.1.1 OGGCORE_11.1.1_PLATFORMS_110421.2040
Windows (optimized), Oracle 11g on Apr 22 2011 03:28:23
Copyright (C) 1995, 2011, Oracle and/or its affiliates. All rights reserved.
Starting at 2011-11-07 18:24:19
Operating System Version:
Microsoft Windows XP Professional, on x86
Version 5.1 (Build 2600: Service Pack 2)
Process id: 4628
Description:
** Running with the following parameters **
extract ffe711
userid ggs@bidb, password ********
discardfile E:\GoldenGate11gMediaPack\V26071-01\dirrpt\EXTFF.dsc, purge
rmthost 10.180.182.77, mgrport 7816
rmtfile E:\GoldenGate11gMediaPack\V26071-01\dirdat\ffremote, purge, megabytes 5
CACHEMGR virtual memory values (may have been adjusted)
CACHEBUFFERSIZE: 64K
CACHESIZE: 1G
CACHEBUFFERSIZE (soft max): 4M
CACHEPAGEOUTSIZE (normal): 4M
PROCESS VM AVAIL FROM OS (min): 1.77G
CACHESIZEMAX (strict force to disk): 1.57G
Database Version:
Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - Production
PL/SQL Release 11.1.0.7.0 - Production
CORE 11.1.0.7.0 Production
TNS for 32-bit Windows: Version 11.1.0.7.0 - Production
NLSRTL Version 11.1.0.7.0 - Production
Database Language and Character Set:
NLS_LANG environment variable specified has invalid format, default value will be used.
NLS_LANG environment variable not set, using default value AMERICAN_AMERICA.US7ASCII.
NLS_LANGUAGE = "AMERICAN"
NLS_TERRITORY = "AMERICA"
NLS_CHARACTERSET = "AL32UTF8"
Warning: your NLS_LANG setting does not match database server language setting.
Please refer to user manual for more information.
2011-11-07 18:24:25 INFO OGG-01226 Socket buffer size set to 27985 (flush size 27985).
2011-11-07 18:24:25 INFO OGG-01052 No recovery is required for target file E:\GoldenGate11gMediaPack\V26071-01\dirdat\ffremote, at RBA 0 (file not opened).
2011-11-07 18:24:25 INFO OGG-01478 Output file E:\GoldenGate11gMediaPack\V26071-01\dirdat\ffremote is using format RELEASE 10.4/11.1.
** Run Time Messages **
5. On the flat file GGSCI prompt:
edit params FFR711
extract ffr711
CUSEREXIT E:\GoldenGate11gMediaPack\GGFlatFile\V22262-01\flatfilewriter.dll CUSEREXIT passthru includeupdatebefores, params "E:\GoldenGate11gMediaPack\GGFlatFile\V22262-01\sample-dirprm\ffwriter.properties"
SOURCEDEFS E:\GoldenGate11gMediaPack\V26071-01\dirdef\vikstkFF.def
table ggs.vikstk;
add extract ffr711, exttrailsource ./dirdat/pp
start extract ffr711
view report ffr711
Oracle GoldenGate Capture
Version 11.1.1.0.0 Build 078
Windows (optimized), Generic on Jul 28 2010 19:05:07
Copyright (C) 1995, 2010, Oracle and/or its affiliates. All rights reserved.
Starting at 2011-11-07 18:21:31
Operating System Version:
Microsoft Windows XP Professional, on x86
Version 5.1 (Build 2600: Service Pack 2)
Process id: 5008
Description:
** Running with the following parameters **
extract ffr711
CUSEREXIT E:\GoldenGate11gMediaPack\GGFlatFile\V22262-01\flatfilewriter.dll CUSEREXIT passthru includeupdatebefores, params "E:\GoldenGate11gMediaPack\GGFlatFile\V22262-01\sample-dirprm\ffwriter.properties"
E:\GoldenGate11gMediaPack\GGFlatFile\V22262-01\ggs_Windows_x86_Generic_32bit_v11_1_1_0_0_078\extract.exe running with user exit library E:\GoldenGate11gMediaPack\GGFlatFile\V22262-01\flatfilewriter.dll, compatibility level (2) is current.
SOURCEDEFS E:\GoldenGate11gMediaPack\V26071-01\dirdef\vikstkFF.def
table ggs.vikstk;
CACHEMGR virtual memory values (may have been adjusted)
CACHEBUFFERSIZE: 64K
CACHESIZE: 1G
CACHEBUFFERSIZE (soft max): 4M
CACHEPAGEOUTSIZE (normal): 4M
PROCESS VM AVAIL FROM OS (min): 1.87G
CACHESIZEMAX (strict force to disk): 1.64G
Started Oracle GoldenGate for Flat File
Version 11.1.1.0.0
** Run Time Messages **
Problems I am facing:
I am not sure where to find the generated flat file.
Even the reports show there is no data at the manager process.
I was expecting a replicat instead of an extract for the flat file FFR711.prm.
This is as far as I have gotten; please give me some pointers.
Thanks,
Vikas

Ok, I haven't run your example, but here are some suggestions.
Vikas Panwar wrote:
extract ffe711
userid ggs@bidb, password ggs12345
discardfile E:\GoldenGate11gMediaPack\V26071-01\dirrpt\EXTFF.dsc, purge
rmthost 10.180.182.77, mgrport 7816
rmtfile E:\GoldenGate11gMediaPack\V26071-01\dirdat\ffremote, purge, megabytes 5
ggsci> add extract FFE711, EXTTRAILSOURCE ./dirdat/oo
ggsci> add rmttrail ./dirdat/pp, extract FFE711, megabytes 20
ggsci> start extract  FFE711
You of course need data captured from somewhere to test with. You could capture changes directly from a database and write those to a trail, and use that as a source for the flat-file writer; or, if you have existing trail data, you can just use that (I often test with old trails, with known data).
In your example, you are using a data pump that is doing nothing more than pumping trails to a remote host. That's fine, if that's what you want to do. (It's actually quite common in real implementations.) But if you want to actually capture changes from the database, then change "add extract ... extTrailSource" to "add extract ... tranlog". I'll assume you want to use the simple data pump to send trail data to the remote host, and that some other database capture process is creating the trail dirdat/oo; a sketch of such a capture extract follows below.
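For reference, here is a minimal sketch of what that capture extract could look like (the group name EXTDB is just an assumption for illustration); it reads the transaction log and writes the local trail dirdat/oo that your pump picks up:
extract extdb
userid ggs@bidb, password ggs12345
exttrail ./dirdat/oo
table ggs.vikstk;
ggsci> add extract extdb, tranlog, begin now
ggsci> add exttrail ./dirdat/oo, extract extdb, megabytes 100
ggsci> start extract extdb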
Also... with your pump "FFE711", you can create either a local or remote trail, that's fine. But don't use a rmtfile (or extfile). You should create a trail, either a "rmttrail" or an "exttrail". The flat-file adapter will read that (binary) trail and generate text files. Trails automatically roll over; "extfile/rmtfile" do not (though they have the same internal GG binary log format). (You can use 'maxfiles' to force them to roll over, but that's beside the point.)
Also:
• Don't forget your "table" statements... or else no data will be processed!! You can wildcard tables, but not schemata.
• There is no reason anything would be discarded in a pump.
• Although it's a matter of choice, I don't see why people use absolute paths for reports and discard files. Full paths to data and def files make sense if they are on the SAN/NAS, but then I'd use symlinks from dirdat to the storage directory (on Unix/Linux).
• Both Windows and Unix can use forward "/" slashes, which makes examples platform-independent (another reason for relative paths).
• Your trails really should be much larger than 5MB for better performance (e.g., 100MB).
• You probably should use a source-defs file, instead of a dblogin, for metadata. Trail data is by its very nature historical, and using "userid...password" in the prm file inherently gets metadata from "right now". The file-writer doesn't handle DDL changes automatically. (See the defgen sketch after this list.)
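If you don't already have that source-defs file, here is a hedged sketch of generating it with the defgen utility (the parameter file name dirprm/defgen.prm is my choice; list your actual tables):
defsfile dirdef/vikstkFF.def
userid ggs@bidb, password ggs12345
table ggs.vikstk;
shell> defgen paramfile dirprm/defgen.prm
Then copy the resulting dirdef/vikstkFF.def to the machine where the flat-file adapter runs, so SOURCEDEFS can find it.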
So you should have something more like:
extract ffe711
sourcedefs dirdef/vikstkFF.def
rmthost 10.180.182.77, mgrport 7816
rmttrail dirdat/ff, purge, megabytes 100
table myschema.*;
table myschema2.*;
table ggs.*;
For the file-writer pump:
5. On the flat file GGSCI prompt:
extract ffr711
CUSEREXIT flatfilewriter.dll CUSEREXIT passthru includeupdatebefores, params dirprm\ffwriter.properties
SOURCEDEFS dirdef/vikstkFF.def
table myschema.*;
table ggs.*;
ggsci> add extract ffr711, exttrailsource ./dirdat/pp
ggsci> start extract ffr711
Again, use relative paths when possible (the flatfilewriter.dll is expected to be found in the GG install directory). Put the ffwriter.properties file into dirprm, just as a best practice. This file, ffwriter.properties, is where you define your output directory and output files. Again, make sure you have a "table" statement in there for each schema in your trails.
Problems I am facing:
I am not sure where to find the generated flat file.
Even the reports show there is no data at the manager process.
I was expecting a replicat instead of an extract for the flat file FFR711.prm.
This is as far as I have gotten; please give me some pointers.
The generated files are defined in the ffwriter.properties file. Search for the "rootdir" property, e.g.,
goldengate.flatfilewriter.writers=csvwriter
csvwriter.files.formatstring=output_%d_%010n
csvwriter.files.data.rootdir=dirout
csvwriter.files.control.ext=_data.control
csvwriter.files.control.rootdir=dirout
...
The main problems you have are: (1) use rmttrail, not rmtfile, and (2) don't forget the "table" statement, even in a pump.
Also, the flat-file adapter runs as just an "extract" data pump; no "replicat" is ever used. Replicats are inherently tied to a target database, and the file-writer doesn't have any database functionality.
Hope it helps,
-m

Similar Messages

  • Golden Gate for MySQL 5.5: extract is abended, and no error in the log file

    Dear All,
    GoldenGate for MySQL 5.5 to Oracle 11g: the extract is abended, but there is no error in the log, and sometimes it successfully extracts some records.
    extract :
    EXTRACT EXT_M1               
    TRANLOGOPTIONS AltLogDest /mydata/mysqllog/binlog/binlog.index       
    SOURCEDB [email protected]:16052, USERID mama,PASSWORD mama        
    sqlexec "set names gbk;"       
    EXTTRAIL dirdat/m1                  
    Dynamicresolution               
    TABLE mama.merchants_member_card_customer;   
    datapump:
    EXTRACT DPRD_M1  
    SOURCEDB [email protected]:16052, USERID mama,PASSWORD mama  
    RMTHOST 192.168.2.57, MGRPORT 7089, compress --COMPRESSUPDATESETWHERE
    RMTTRAIL /home/oracle/goldengate/dirdat/m1
    NOPASSTHRU  
    TABLE mama.merchants_member_card_customer;
    GGSCI>>info all
    Program     Status      Group       Lag at Chkpt  Time Since Chkpt
    MANAGER     RUNNING                                          
    EXTRACT     RUNNING     DPRD_M1     00:00:00      00:00:01   
    EXTRACT     ABENDED     EXT_M1      00:11:49      00:01:56
    REPORT:
    GGSCI>>view report ext_m1
                      Oracle GoldenGate Capture for MySQL
          Version 11.2.1.0.1 OGGCORE_11.2.1.0.1_PLATFORMS_120423.0230
    Linux, x64, 64bit (optimized), MySQL Enterprise on Apr 23 2012 05:23:34
    Copyright (C) 1995, 2012, Oracle and/or its affiliates. All rights reserved.
                        Starting at 2013-09-29 18:38:08
    Operating System Version:
    Linux
    Version #1 SMP Wed Jun 13 18:24:36 EDT 2012, Release 2.6.32-279.el6.x86_64
    Node: M46
    Machine: x86_64
                             soft limit   hard limit
    Address Space Size   :    unlimited    unlimited
    Heap Size            :    unlimited    unlimited
    File Size            :    unlimited    unlimited
    CPU Time             :    unlimited    unlimited
    Process id: 6322
    Description:
    **            Running with the following parameters                  **
    2013-09-29 18:38:08  INFO    OGG-03035  Operating system character set identified as UTF-8. Locale: zh_CN, LC_ALL:.
    EXTRACT EXT_M1
    TRANLOGOPTIONS AltLogDest /mydata/mysqllog/binlog/binlog.index
    SOURCEDB [email protected]:16052, USERID mama100,PASSWORD ****************
    sqlexec "set names gbk;"
    Executing SQL statement...
    2013-09-29 18:38:08  INFO    OGG-00893  SQL statement executed successfully.
    EXTTRAIL dirdat/m1
    Dynamicresolution
    TABLE mama100.merchants_member_card_customer;
    2013-09-29 18:38:08  INFO    OGG-01815  Virtual Memory Facilities for: COM
        anon alloc: mmap(MAP_ANON)  anon free: munmap
        file alloc: mmap(MAP_SHARED)  file free: munmap
        target directories:
        /home/goldengate/dirtmp.
    CACHEMGR virtual memory values (may have been adjusted)
    CACHESIZE:                               64G
    CACHEPAGEOUTSIZE (normal):                8M
    PROCESS VM AVAIL FROM OS (min):         128G
    CACHESIZEMAX (strict force to disk):     96G
    Database Version:
    MySQL
    Server Version: 5.5.24-patch-1.0-log
    Client Version: 6.0.0
    Host Connection: 192.168.2.46 via TCP/IP
    Protocol Version: 10
    2013-09-29 18:38:08  INFO    OGG-01056  Recovery initialization completed for target file dirdat/m1000000, at RBA 1295, CSN 000086|000000065228677.
    2013-09-29 18:38:08  INFO    OGG-01478  Output file dirdat/m1 is using format RELEASE 11.2.
    2013-09-29 18:38:08  INFO    OGG-01026  Rolling over remote file dirdat/m1000000.
    2013-09-29 18:38:08  INFO    OGG-00182  VAM API running in single-threaded mode.
    2013-09-29 18:38:08  INFO    OGG-01515  Positioning to begin time 2013-9-29 06:26:18.
    **                     Run Time Messages                             **
    2013-09-29 18:38:08  INFO    OGG-01516  Positioned to Log Number: 86
        Record Offset: 65223906, 2013-9-29 06:26:18.
    2013-09-29 18:38:08  INFO    OGG-01517  Position of first record processed Log Number: 86
        Record Offset: 65223906, 2013-9-29 06:26:18.
    TABLE resolved (entry mama100.merchants_member_card_customer):
      TABLE mama100."merchants_member_card_customer";
    Using the following key columns for source table mama100.merchants_member_card_customer: id.
    2013-09-29 18:38:08  INFO    OGG-01054  Recovery completed for target file dirdat/m1000001, at RBA 1316, CSN 000086|000000065228677.
    2013-09-29 18:38:08  INFO    OGG-01057  Recovery completed for all targets.
    ggsevt:
    2013-09-29 18:38:08  INFO    OGG-00963  Oracle GoldenGate Manager for MySQL, mgr.prm:  Command received from GGSCI on host localhost (START EXTRACT EXT_M1 ).
    2013-09-29 18:38:08  INFO    OGG-00975  Oracle GoldenGate Manager for MySQL, mgr.prm:  EXTRACT EXT_M1 starting.
    2013-09-29 18:38:08  INFO    OGG-00992  Oracle GoldenGate Capture for MySQL, ext_m1.prm:  EXTRACT EXT_M1 starting.
    2013-09-29 18:38:08  INFO    OGG-03035  Oracle GoldenGate Capture for MySQL, ext_m1.prm:  Operating system character set identified as UTF-8. Locale: zh_CN, LC_ALL:.
    2013-09-29 18:38:08  INFO    OGG-00893  Oracle GoldenGate Capture for MySQL, ext_m1.prm:  SQL statement executed successfully.
    2013-09-29 18:38:08  INFO    OGG-01815  Oracle GoldenGate Capture for MySQL, ext_m1.prm:  Virtual Memory Facilities for: COM
        anon alloc: mmap(MAP_ANON)  anon free: munmap
        file alloc: mmap(MAP_SHARED)  file free: munmap
        target directories:
        /home/goldengate/dirtmp.
    2013-09-29 18:38:08  INFO    OGG-00993  Oracle GoldenGate Capture for MySQL, ext_m1.prm:  EXTRACT EXT_M1 started.
    2013-09-29 18:38:08  INFO    OGG-01056  Oracle GoldenGate Capture for MySQL, ext_m1.prm:  Recovery initialization completed for target file dirdat/m1000000, at RBA 1295, CSN 000086|000000065228677.
    2013-09-29 18:38:08  INFO    OGG-01478  Oracle GoldenGate Capture for MySQL, ext_m1.prm:  Output file dirdat/m1 is using format RELEASE 11.2.
    2013-09-29 18:38:08  INFO    OGG-01026  Oracle GoldenGate Capture for MySQL, ext_m1.prm:  Rolling over remote file dirdat/m1000000.
    2013-09-29 18:38:08  INFO    OGG-00182  Oracle GoldenGate Capture for MySQL, ext_m1.prm:  VAM API running in single-threaded mode.
    2013-09-29 18:38:08  INFO    OGG-01515  Oracle GoldenGate Capture for MySQL, ext_m1.prm:  Positioning to begin time 2013-9-29 06:26:18.
    2013-09-29 18:38:08  INFO    OGG-01516  Oracle GoldenGate Capture for MySQL, ext_m1.prm:  Positioned to Log Number: 86
        Record Offset: 65223906, 2013-9-29 06:26:18.
    2013-09-29 18:38:08  INFO    OGG-01517  Oracle GoldenGate Capture for MySQL, ext_m1.prm:  Position of first record processed Log Number: 86
        Record Offset: 65223906, 2013-9-29 06:26:18.
    2013-09-29 18:38:08  INFO    OGG-01054  Oracle GoldenGate Capture for MySQL, ext_m1.prm:  Recovery completed for target file dirdat/m1000001, at RBA 1316, CSN 000086|000000065228677.
    2013-09-29 18:38:08  INFO    OGG-01057  Oracle GoldenGate Capture for MySQL, ext_m1.prm:  Recovery completed for all targets.
    2013-09-29 18:38:09  INFO    OGG-01054  Oracle GoldenGate Capture for MySQL, dprd_m1.prm:  Recovery completed for target file /home/oracle/goldengate/dirdat/m1000002, at RBA 1435, CSN 000086|000000055512672.
    2013-09-29 18:38:09  INFO    OGG-01057  Oracle GoldenGate Capture for MySQL, dprd_m1.prm:  Recovery completed for all targets.

    GGSCI>>info ext_m1 showch
    EXTRACT    EXT_M1    Last Started 2013-09-29 18:38   Status ABENDED
    Checkpoint Lag       00:11:49 (updated 00:12:05 ago)
    VAM Read Checkpoint  2013-09-29 18:26:18.665841
    Current Checkpoint Detail:
    Read Checkpoint #1
      VAM External Interface
      Startup Checkpoint (starting position in the data source):
        Timestamp: 2013-09-29 18:26:18.665841
      Recovery Checkpoint (position of oldest unprocessed transaction in the data source):
        Timestamp: 2013-09-29 18:26:18.665841
      Current Checkpoint (position of last record read in the data source):
        Timestamp: 2013-09-29 18:26:18.665841
    Write Checkpoint #1
      GGS Log Trail
      Current Checkpoint (current write position):
        Sequence #: 0
        RBA: 917
        Timestamp: 2013-09-29 18:30:55.655570
        Extract Trail: dirdat/m1
    CSN state information:
      CRC: 20-82-1D-34
      CSN: Not available
    Header:
      Version = 2
      Record Source = A
      Type = 8
      # Input Checkpoints = 1
      # Output Checkpoints = 1
    File Information:
      Block Size = 2048
      Max Blocks = 100
      Record Length = 20480
      Current Offset = 0
    Configuration:
      Data Source = 5
      Transaction Integrity = 1
      Task Type = 0
    Status:
      Start Time = 2013-09-29 18:38:08
      Last Update Time = 2013-09-29 18:38:08
      Stop Status = A
      Last Result = 0

  • What are the settings for datasource and infopackage for flat file loading

    Hi,
    I'm trying to load data from a flat file to a DSO. Can anyone tell me what the settings are for the datasource and infopackage for flat file loading?
    Please let me know.
    Regards
    Kumar

    Loading of transaction data in BI 7.0: a step-by-step guide on how to load data from a flat file into the BI 7 system
    Uploading of Transaction data
    Log on to your SAP
    Transaction code RSA1 takes you to Modeling
    1. Creation of Info Objects
    • In left panel select info object
    • Create info area
    • Create info object catalog ( characteristics & Key figures ) by right clicking the created info area
    • Create new characteristics and key figures under respective catalogs according to the project requirement
    • Create required info objects and Activate.
    2. Creation of Data Source
    • In the left panel select data sources
    • Create application component(AC)
    • Right click AC and create datasource
    • Specify data source name, source system, and data type ( Transaction data )
    • In general tab give short, medium, and long description.
    • In extraction tab specify file path, header rows to be ignored, data format(csv) and data separator( , )
    • In proposal tab load example data and verify it.
    • In the field tab you can give the technical names of the info objects in the template, so you do not have to map them during the transformation; the system will map them automatically. If you do not map them in this field tab, you have to map them manually during the transformation in the InfoProviders.
    • Activate data source and read preview data under preview tab.
    • Create an info package by right-clicking the data source, and in the schedule tab click Start to load the data to the PSA. (Make sure the flat file is closed during loading.)
    3. Creation of data targets
    • In left panel select info provider
    • Select created info area and right click to create ODS( Data store object ) or Cube.
    • Specify a name for the ODS or cube and click Create.
    • From the template window select the required characteristics and key figures and drag and drop it into the DATA FIELD and KEY FIELDS
    • Click Activate.
    • Right click on ODS or Cube and select create transformation.
    • In the source of the transformation, select the object type (data source) and specify its name and source system. Note: the source system will be a temporary folder or package into which the data is stored.
    • Activate created transformation
    • Create Data transfer process (DTP) by right clicking the master data attributes
    • In extraction tab specify extraction mode ( full)
    • In update tab specify error handling ( request green)
    • Activate DTP and in execute tab click execute button to load data in data targets.
    4. Monitor
    Right-click the data targets, select Manage, and in the contents tab select Contents to view the loaded data. There are two tables in an ODS, the new table and the active table; to move data from the new table to the active table, you have to activate the request after the load. Alternatively, the monitor icon can be used.
    Loading of master data in BI 7.0:
    For Uploading of master data in BI 7.0
    Log on to your SAP
    Transaction code RSA1 takes you to Modeling
    1. Creation of Info Objects
    • In left panel select info object
    • Create info area
    • Create info object catalog ( characteristics & Key figures ) by right clicking the created info area
    • Create new characteristics and key figures under respective catalogs according to the project requirement
    • Create required info objects and Activate.
    2. Creation of Data Source
    • In the left panel select data sources
    • Create application component(AC)
    • Right click AC and create datasource
    • Specify data source name, source system, and data type ( master data attributes, text, hierarchies)
    • In general tab give short, medium, and long description.
    • In extraction tab specify file path, header rows to be ignored, data format(csv) and data separator( , )
    • In proposal tab load example data and verify it.
    • In the field tab you can give the technical names of the info objects in the template, so you do not have to map them during the transformation; the system will map them automatically. If you do not map them in this field tab, you have to map them manually during the transformation in the InfoProviders.
    • Activate data source and read preview data under preview tab.
    • Create an info package by right-clicking the data source, and in the schedule tab click Start to load the data to the PSA. (Make sure the flat file is closed during loading.)
    3. Creation of data targets
    • In left panel select info provider
    • Select created info area and right click to select Insert Characteristics as info provider
    • Select required info object ( Ex : Employee ID)
    • Under that info object select attributes
    • Right click on attributes and select create transformation.
    • In the source of the transformation, select the object type (data source) and specify its name and source system. Note: the source system will be a temporary folder or package into which the data is stored.
    • Activate created transformation
    • Create Data transfer process (DTP) by right clicking the master data attributes
    • In extraction tab specify extraction mode ( full)
    • In update tab specify error handling ( request green)
    • Activate DTP and in execute tab click execute button to load data in data targets.

  • How the IE works for a flat file

    Hi all:
         As we all know, when the IE gets an IDoc's service name from the SLD, it uses it together with the IDoc's message type and IDoc type to do receiver determination. What about a flat file? How can we know its service name and interface name if there is only a flat file on FTP? How does the IE work for a flat file?
         Couldn't thank you more.

    Hi,
    For any IDoc scenarios, you would use business systems rather than business services, which are stored in the SLD. So the IE would fetch it from the SLD at runtime.
    For file-based scenarios also, you can create a business system of type third party and use the same.
    Does that answer your question?
    Regards
    Krish

  • When we go for Flat File creation

    Hi gurus,
              When do we go for flat file creation? Can anyone give me one scenario with an example?
    Thanks in advance

    Hello Srikar,
    How are you?
    We go for flat files in some scenarios, for example:
    1. The source system cannot be connected to the BW system (here you could generate a flat file)
    2. The customer asks for a different kind of report (in our case we do flat file creation for complex time management: employee working time is first merged with R/3 data, then we combine both together as one flat file)
    3. Sometimes the customer will give us the flat file data generated by them
    4. We also take flat file reports from data targets, using InfoSpokes
    More scenarios are there,
    Best Regards....
    Sankar Kumar
    +91 98403 47141

  • Selection in IP for flat file Load

    Hi Experts ,
    I want to know whether I can have selections available for an InfoPackage which is created for a flat file as the data source.
    Your early response is highly appreciated.
    Regards
    Patil

    Hi,
       Yes, you can. But at the data source level you have to tick the selection fields, so that at the InfoPackage level those fields will be available for selection.
    Regards
    Sankar

  • Delta Upload for Flat File

    Hello Everyone
    I am Srikanth. I would like to know whether we have a facility for delta upload with flat files. If yes, can you please give me the steps?
    Thanks in advance
    Srikanth

    Hi Sabrina, thank you for your help. I did load the data from the cube to the ODS.
    Steps:
    1. I generated an export data source on the cube.
    2. I found the name of the cube with prefix 8<infocube name> in the infosource under the DM application component.
    3. The communication structure and transfer rules were already activated, but when I created update rules for the ODS I got the message that 0RECORDMODE is missing in the infosource.
    4. So I went to the infosource, added 0RECORDMODE to the communication structure and activated, but the transfer rules stayed yellow; there was no object assigned to 0RECORDMODE, but I activated anyway.
    5. Again I went to the ODS, created update rules and activated (this time I did not get any message about 0RECORDMODE).
    6. I created an infopackage and loaded.
    a) Now my question is: without a green signal in the transfer rules, how did the data get populated into the ODS? In your answer you mentioned creating the communication structure and transfer rules, whereas I didn't do anything there.
    b) Will I face any problem if I keep loading data into the ODS from the cube (with the yellow signal)? Is it a correct procedure? Please correct me. Thanks in advance.

  • Best practice around handling time dependency for flat file loads

    Hi folks,
    This is a fairly common situation - handling time dependency for flat file loads. Please can anyone share their experience around handling this. One common approach is to handle the time validity changes within the flat file, where they are easily changeable by the user, but then again this is prone to input errors by the user. Another would be to handle this via a DSO. Possibly, also have this data entered directly in BI using IP planning layouts. There is an IP planning function that allows for loading flat file data, but then again, it only works without the time dependency factor.
    It would be great to hear thoughts or if anyone can point to a best practise document for such a scenario.
    Thanks.

    Bump!

  • Oracle OWB integrator for Flat Files 3.0

    Hi,
    I have installed OWB 11g R2 on my machine, but while creating a flat files module for file import it reports that
    it cannot find "Oracle OWB Integrator for Flat Files 3.0", so I cannot use the "import metadata using flat file" wizard.
    Do I need to install OWB again, or just the Oracle OWB Integrator for Flat Files 3.0? Or how do I integrate it with OWB? Please help if anyone can.
    Thanks.

    Here is the certification matrix
    https://www.google.co.in/url?sa=t&rct=j&q=&esrc=s&source=web&cd=5&ved=0CFEQFjAE&url=http%3A%2F%2Fwww.oracle.com%2Ftechnetwork%2Fmiddleware%2Fdata-integration%2Fodi-11gr1certmatrix-ps6-1928216.xls&ei=mtmUUcX7DoiJrQek04DwAg&usg=AFQjCNGoOUFQHdK7Ti2DLb6vz_3s-miP3A&sig2=q3rf2foe9bl4_WbsLPwWng&bvm=bv.46471029,d.bmk

  • Custom code for Flat file reconciliation on LDAP

    Hello,
    I have to write custom code for flat file reconciliation on LDAP, as the GTC connector wasn't working entirely.
    Could someone help me out with this? How do I do it?
    Thanks

    flat file reconciliation on LDAP
    What do you mean by flat file on LDAP?
    If you want to create a flat file connector, then search Google for reading a flat file using Java.
    Define the RO fields and do the mapping in the Process Definition. You can use the Xellerate User RO for trusted recon.
    Make a map of the CSV fields and the recon fields.
    Call the Reconciliation API.

  • Omit the Open Hub control file 'S_*' for flat file extracts

    Hi Folks,
    a quick question: is it somehow possible to omit the control file generation for flat file extracts?
    We have some Unix scripts running that get confused by the S_* files, and we were wondering if we can switch the creation of those files off.
    Thanks and best regards,
    Axel

    Hi,
    However, the S_ (structure) file does not reflect the proper structure. The S_ file lists all fields sorted in alphabetical order based on the field name, instead of listing all fields in the correct order that they appear in the output file.
    As far as I know and have checked, this is not the case. The S_ file has the fields in the order of the InfoSpoke object sequence (and, in the transformation tab, the source structure).
    I would suggest you check it again.
    Derya

  • Golden gate for sybase

    I have a Sybase ASE source and want to use GoldenGate replication to capture the changes and put them into a flat file or Netezza.
    Would GoldenGate be able to do this?
    Thanks

    Yes, you can. See http://docs.oracle.com/cd/E14571_01/integrate.1111/e12644/ogg.htm for more details. For a flat file, use the GoldenGate extract parameter FORMATASCII. There are also other flat file formats: FORMATSQL and FORMATXML.
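    A hedged sketch of what such a capture parameter file might look like (the group, DSN, host, and table names are assumptions; FORMATASCII must appear before the file it applies to):
    extract extff
    sourcedb mysrcdb, userid ggsuser, password ggspass
    formatascii, delimiter ',', names
    rmthost filehost, mgrport 7809
    rmtfile ./dirdat/orders.txt, megabytes 100
    table dbo.orders;
    This writes comma-delimited text with column names instead of the normal binary trail format.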

  • Transfer structure sequence for Flat File

    I am wondering whether it is really important to maintain a particular sequence in the DataSource/transfer structure for InfoObjects which are set up for getting a constant value in the transfer rules. I am having trouble loading a flat file. The transfer structure has some compounded characteristics, like 0Fisvar for 0Fiscper and 0Co_area for 0profit_ctr, both getting a fixed value in the transfer rules. How should the flat file and the transfer structure sequence be maintained?

    Hi Vishan,
    This is what I used to do.
    If there is a compounded InfoObject, or a key figure that requires a unit (even though in your case the compounded InfoObject and the unit are always the same), you need to add them in the transfer structure.
    After adding them, this is what I would do:
    Move them around to match the flat file structure.
    Now, the fields that are always constants and not coming from the flat file, move them to the end.
    They will be ignored; your transfer rules will populate them.
    If you get an error now, that means your flat file is not formatted correctly; apart from that, you are fine.
    Ram Chamarthy
    Message was edited by: Ram Chamarthy

  • How to create security for flat files under a Webserver Directory?

    Hello,
    I have an Application where I need to show a Flat Excel File (with open/save option) from a URL from Portal.
    This works fine.
    But a user can view this file without logging in to the Portal.
    Though we restricted the directory, file access is uncontrolled.
    What should we do?
    Thanks
    Madhav

    I mean: how do we enforce Portal login for flat files in the web server directory (when users access them directly by URL)? They should be accessible only as long as the user is logged into Oracle Portal. But right now, they can access a flat file by typing its URL directly into the browser.
    Thanks for your time
    Madhav

  • Selecting fields for flat file input

    I've built a scenario that uses the File adapter to load a flat file into SAP and fully understand the concepts. My problem is that I have a fixed-length file I need to load which has 50+ fields per row. I only need 10 fields per row, and those fields are spread throughout the row. Using fieldFixedLengths I specify the length of each field, and it appears they must be consecutive fields. How can I specify just the fields that I need?
    For example my input file may be as follows
    AAAAAAABBBBBBBCCCCCCCDDDDDDEEEEEEE
    I only need AAAAAAA CCCCCCC and EEEEEEE. Is there a way to do this?

    Hi Robert,
    I don't know why you would want to upload all the junk data instead of importing only the relevant fields. If I had to do this, I would do it outside the XI box. If the file is in some fixed format, or let's say CSV format, just open the file in Excel, remove the columns you don't want, and save it again in the same file format; then import and map it inside XI, and it will be simple.
    Also, if you don't want to do this, then tell us what logic you want to use to identify which rows/fields you don't need; I assume you already know which rows/fields you want.
    Regards
    Aashish Sinha
    PS : reward points if helpful
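    As an alternative inside the sender file adapter itself, here is a hedged content-conversion sketch (Row is an assumed recordset element name; skip1/skip2 are placeholder dummy fields): declare every column's width, give the unwanted columns dummy names, and simply leave those dummy fields unmapped in the message mapping:
    Row.fieldFixedLengths  7,7,7,6,7
    Row.fieldNames         fieldA,skip1,fieldC,skip2,fieldE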
