Problem Creating DataStore in DS with ECC 6.0 as datasource

Dear experts,
I am unable to create a DataStore in Data Services on this Windows 2003 SP2 machine. My source system is ECC 6.0. I get "Cannot establish database connection. ... Please make sure SAP server is running and the login information is correct. (BODI-1111348)".
I have provided the server name (192.168.xxx.xxx), username and password, and under Advanced I supplied Language as EN, Client Number as 800 and System Number as 00.
Do I need to install anything related to ECC on this Windows machine? The ECC system is also on a Windows NT machine.

Here are some things to check:
- Verify that the SAP application server IP address, username, password, client and SID are correct
- Try to ping the SAP server IP address
- Try to ping the SAP host's DNS name; you may need to use the fully qualified name, e.g. myserver.mycompany.com (a quick connectivity check is sketched in code below)
- In the datastore, try replacing the SAP host's IP address with the DNS name
- If this is a dialog account that you use to log into SAP GUI, your SAP system may require a non-dialog account for service connections
These topics are discussed in the Technical Manuals, Supplement for SAP
- Did you install the DS functions in SAP?
- Have you set up your SAP user authorizations?
This link may be useful:
http://wiki.sdn.sap.com/wiki/display/BOBJ/ConnectingtoSAP
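
To illustrate the ping/DNS checks above, here is a minimal Java sketch (the host name and timeout values are placeholders, not taken from this thread) that resolves the SAP host and tests basic reachability before you troubleshoot the datastore itself:

    import java.net.InetAddress;
    import java.net.InetSocketAddress;
    import java.net.Socket;

    public class SapHostCheck {
        public static void main(String[] args) throws Exception {
            String host = "myserver.mycompany.com";  // or the IP address, e.g. 192.168.xxx.xxx
            int dispatcherPort = 3200;               // 32NN, where NN is the SAP system number (00 here)

            // 1. Can the host name be resolved at all?
            InetAddress address = InetAddress.getByName(host);
            System.out.println("Resolved " + host + " to " + address.getHostAddress());

            // 2. Is the host reachable (roughly a ping; may be blocked by firewalls)?
            System.out.println("Reachable: " + address.isReachable(3000));

            // 3. Can we open a TCP connection to the SAP dispatcher port?
            try (Socket socket = new Socket()) {
                socket.connect(new InetSocketAddress(address, dispatcherPort), 3000);
                System.out.println("TCP connect to port " + dispatcherPort + " succeeded");
            }
        }
    }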

Similar Messages

  • Problem creating an sql query with a parameter which is a list

    Hi,
    I'm having a problem creating a certain SQL query.
    The query looks like this:
    SELECT gstock_id FROM germplasm_stock gps, germplasm gp WHERE gps.germplasm_id = gp.germplasm_id AND organism_id IN ($childList:VARCHAR).
    The organism_id field is of DECIMAL type.
    The parameter childList is actually a list of IDs, something like 123,124,789,
    and it is created dynamically by another function, so I can't just put it there statically.
    I tried using the ARRAY type instead of VARCHAR, but that didn't work.
    Does anyone know how I can give this query a parameter which is a list of numbers?
    Thanks

    I have tried all the following options and the same issue occurs:
    EXEC dbo.uspGetSiteChanges @ChangeVersion = ?
    With Parameter: 0, @ChangeVersion, ChangeVersion
    EXEC dbo.uspGetSiteChanges ?
    With Parameter: 0, @ChangeVersion, ChangeVersion
    In my first data flow I use the following and it works on two OLE DB Sources:
    EXEC dbo.uspGetSiteChanges @ChangeVersion = ?
    With:
    In my second data flow task, I use the same command and parameter mappings and it fails, very strange.
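
    One common way to pass a list of IDs to a query like the one above is to build the IN clause with one JDBC placeholder per value and bind them individually. A minimal sketch, assuming a plain java.sql connection (table and column names are taken from the question; the connection itself and the method name are hypothetical):

    import java.math.BigDecimal;
    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.util.List;
    import java.util.stream.Collectors;

    public class InClauseExample {
        public static void printStockIds(Connection conn, List<BigDecimal> childList) throws Exception {
            // Build "?, ?, ?" with one placeholder per ID instead of a single VARCHAR parameter.
            String placeholders = childList.stream().map(id -> "?").collect(Collectors.joining(", "));
            String sql = "SELECT gstock_id FROM germplasm_stock gps, germplasm gp "
                       + "WHERE gps.germplasm_id = gp.germplasm_id "
                       + "AND organism_id IN (" + placeholders + ")";
            try (PreparedStatement ps = conn.prepareStatement(sql)) {
                for (int i = 0; i < childList.size(); i++) {
                    ps.setBigDecimal(i + 1, childList.get(i));  // JDBC parameters are 1-based
                }
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        System.out.println(rs.getObject("gstock_id"));
                    }
                }
            }
        }
    }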

  • Problem creating an OData service with RFC-/BOR Import.

    Hi Experts,
    I always run into the same problem.
    I want to create an OData service with SEGW and a data model based on an RFC import.
    The RFC is GET_CPU_ALL,
    but I always run into the same problem when I try to generate the runtime objects.
    I don't know why.
    Error: ... _DPC was generated with syntax errors....
    I would really appreciate it if someone could help me.
    Thanks

    Oh dear, you're not an ABAP guy, are you?
    OK, you can go back to my previous suggestion and do it a different way, but first...
    TF_CPU_ALL isn't a table, it's a structure of type CPU_ALL. GET_CPU_ALL uses that as the interface structure for the data it sends out.
    Starting with a blank service, create a model using the DDIC import wizard with CPU_ALL as the structure. Call the entity "CPU". Mark 'serialnr' as the key. Create an entity set called "CPUSet".
    Generate the service.
    Go to DPC_EXT, find the method CPUSET_GET_ENTITYSET and redefine it. Assuming that you can leave the inputs as the defaults, call function GET_CPU_ALL in the method with the table parameter TF_CPU_ALL assigned to ET_ENTITYSET.
    CALL FUNCTION 'GET_CPU_ALL'
      TABLES
        tf_cpu_all = et_entityset
      EXCEPTIONS
        OTHERS     = 1.
    IF sy-subrc <> 0.
      " Implement suitable error handling here
    ENDIF.
    That's all it needs.
    Activate EXT class and try your service.
    opu/odata/sap/ZCPUALL_SRV/CPUSet?$format=json
      "type" : "ZCPUALL_SRV.CPU",
      "Type" : 101,
      "Subtype" : 3,
      "Serialnr" : 0,
      "NbrCpu" : 2,
      "Load1Avg" : 7,
      "Load5Avg" : 29,
      "Load15Avg" : 30,
      "IntSec" : 1059,
      "SyscSec" : 4555,
      "CsSec" : 895,
      "UsrTotal" : 3,
      "SysTotal" : 1,
      "IdleTotal" : 96,
      "IdleTrue" : 96,
      "WaitTrue" : 0
    Did it myself in under 10 minutes from build to test.

  • Problems creating the Master Repository with MS SQL Server 2000

    Hello guys!
    I can't create the Master Repository with an MS SQL Server 2000 database.
    What is the correct address for the URL?
    I selected the driver com.microsoft.jdbc.sqlserver.SQLServerDriver and the URL jdbc:microsoft:sqlserver://<host>:<port>;SelectMethod=cursor[;<property>=<value>...]
    Thanks
    Maurício
    Edited by: user857262 on 03/10/2008 10:05

    Hi Maurício,
    For MS SQL Server, the following driver JARs should be available in the /drivers folder (http://www.inetsoftware.de/):
    * msutil.jar
    * mssqlserver.jar
    * msbase.jar
    The JDBC driver class is:
    com.microsoft.jdbc.sqlserver.SQLServerDriver
    The JDBC URL format is:
    jdbc:sqlserver://serverName\instance:port;property=value[;property=value]
    For example:
    jdbc:sqlserver://myHost:1433;selectMethod=cursor;databaseName=myDB
    Thanks,
    G
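
    As a sanity check outside of the repository tool, you can try the same URL from plain JDBC; a minimal sketch, assuming the driver JAR is on the classpath and using the example URL from the reply (host, port, database name and credentials are placeholders):

    import java.sql.Connection;
    import java.sql.DriverManager;

    public class RepositoryConnectionTest {
        public static void main(String[] args) throws Exception {
            // For pre-JDBC-4 drivers, load the class explicitly first, e.g.:
            // Class.forName("com.microsoft.jdbc.sqlserver.SQLServerDriver");
            // Note: that older Microsoft driver expects the prefix jdbc:microsoft:sqlserver://,
            // while newer drivers use jdbc:sqlserver://, so match the prefix to the JAR you have.
            String url = "jdbc:sqlserver://myHost:1433;selectMethod=cursor;databaseName=myDB";
            try (Connection conn = DriverManager.getConnection(url, "repoUser", "repoPassword")) {
                System.out.println("Connected successfully: " + !conn.isClosed());
            }
        }
    }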

  • Having problems creating PDF from website with query-string URLs

    I have a website that I would like to create a PDF from. I am using the Create -> PDF from Web Page..., selecting the site's home page, and capturing 2 levels, with "stay on same path" and "stay on same server" checked in order to limit the scope of the crawl.
    Where the pages are at example.com/foo/ and example.com/foo/bar/, this works fine. However, where the pages are at example.com/foo/ and example.com/foo/?p=1, the page represented by the query string URL is not converted to the PDF.
    This is a problem, given that the site I want to archive as a PDF uses query strings for most of its pages.
    I have been able to individually convert a single query-string-based page into a PDF using this method, but doing this for every page on the site would be almost impossible given the sheer number of pages on the site.
    Is this a known issue? Is there a workaround other than separately capturing each page (which would be prohibitive effort)?
    I have tried this in both Acrobat Pro X and Acrobat Pro 9 for Mac, with the same results.

    Remember, Acrobat is a 32-bit application and as such cannot access all that 'extra' stuff.
    Be well...

  • Problems creating an spatial index with srid=4326

    Hi!
    I would like to know if somebody can help me with the following problem: we are using version 10.2.0.1 and we need our SRID value to be 4326. We have no problems with 8307 or other values. However, when we try to use SRID = 4326, the following error message appears:
    Error on line 17:
    CREATE INDEX SIDX_D3M_SDO_GEOMETRY ON DAT_3DM_MODEL (DM3_SDO_GEOMETRY) INDEXTYPE ...
    ORA-29855: an error in the execution of routine ODCIINDEXCREATE has taken place
    ORA-13249: internal error in Spatial index: [mdidxrbd]
    ORA-13249: Error initializing geodetic transform
    ORA-06512: in "MDSYS.SDO_INDEX_METHOD_10I", line 10
    The PL/SQL that we used is the following one:
    DELETE FROM USER_SDO_GEOM_METADATA WHERE TABLE_NAME = “DAT_3DM_MODEL”
    COMMIT
    INSERT INTO USER_SDO_GEOM_METADATA (TABLE_NAME, COLUMN_NAME, DIMINFO, SRID) VALUES('DAT_3DM_MODEL', 'DM3_SDO_GEOMETRY', MDSYS.SDO_DIM_ARRAY ( MDSYS.SDO_DIM_ELEMENT ('LONGITUDE', -180, 180, 0,05), -- MDSYS.SDO_DIM_ELEMENT ('LATITUDE', -90, 90, 0.05) ), 4326 )
    CREATE INDEX SIDX_D3M_SDO_GEOMETRY ON DAT_3DM_MODEL (DM3_SDO_GEOMETRY) INDEXTYPE IS MDSYS.SPATIAL_INDEX
    Thanks in advance,
    Susana.

    I cannot reproduce the error in my environment. However:
    Your INSERT statement into USER_SDO_GEOM_METADATA appears to include some typos. They might have happened when transcribing. Please make sure you use the following:
    INSERT INTO USER_SDO_GEOM_METADATA (
      TABLE_NAME,
      COLUMN_NAME,
      DIMINFO,
      SRID)
    VALUES (
      'DAT_3DM_MODEL',
      'DM3_SDO_GEOMETRY',
      MDSYS.SDO_DIM_ARRAY (
        MDSYS.SDO_DIM_ELEMENT ('LONGITUDE', -180, 180, 10),
        MDSYS.SDO_DIM_ELEMENT ('LATITUDE', -90, 90, 10)),
      4326);
    However, the actual culprit is most likely something different. As I suspected, it might be related to the "decimal comma": in Germany, for instance, a decimal comma is used instead of a decimal point. You have used a decimal comma in your original INSERT as well (0,05 instead of 0.05).
    Please try the following:
    SQL> select wktext from cs_srs where srid = 4326;
    WKTEXT
    GEOGCS [ "WGS 84", DATUM ["World Geodetic System 1984 (EPSG ID 6326)",
      SPHEROID ["WGS 84 (EPSG ID 7030)", 6378137, 298.257223563]],
      PRIMEM [ "Greenwich", 0.000000 ],
      UNIT ["Decimal Degree", 0.01745329251994328]]
    On your system, you will likely find a decimal comma, where my output has a decimal point. This is bug 5097326, which has been fixed, and backported to 10.2.0.3 and 10.2.0.4.

  • Problem creating an extendable panel with a header

    Hi -
    I'm trying to create a base panel class. What it needs is a header at the
    top of it with a title box (another panel) of a different color. The title, of
    course, goes in the title box. My primitive solution was to create a parent
    panel, place the title box on the header panel, and then place the header on
    the parent panel. My problem is that I can't extend from that and add
    components to it with the header there.
    Does anybody have a more elegant solution?
    Thanks.
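
    A minimal Swing sketch of one possible approach (purely illustrative; the class and method names are made up): keep the header with its title box fixed in BorderLayout.NORTH and expose a separate content panel in the center, so subclasses add their components to the content panel instead of to the base panel itself.

    import java.awt.BorderLayout;
    import java.awt.Color;
    import java.awt.FlowLayout;
    import javax.swing.JLabel;
    import javax.swing.JPanel;

    // Base panel: a fixed header (with a differently coloured title box) plus a content area.
    public class HeaderedPanel extends JPanel {
        private final JPanel content = new JPanel();

        public HeaderedPanel(String title) {
            super(new BorderLayout());

            // Title box: the smaller, differently coloured panel that holds the title text.
            JPanel titleBox = new JPanel();
            titleBox.setBackground(Color.LIGHT_GRAY);
            titleBox.add(new JLabel(title));

            // Header: spans the top of the panel and contains the title box.
            JPanel header = new JPanel(new FlowLayout(FlowLayout.LEFT));
            header.setBackground(Color.DARK_GRAY);
            header.add(titleBox);

            add(header, BorderLayout.NORTH);
            add(content, BorderLayout.CENTER);
        }

        // Subclasses (and callers) add their components here, never on top of the header.
        public JPanel getContentPanel() {
            return content;
        }
    }

    A subclass then calls getContentPanel().add(...) instead of add(...), which leaves the header intact no matter what the subclass contributes.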


  • Problems creating Blu Ray Discs with AME Presets in CS4

    After several weeks wasted (months back) with an ATI 5750 graphics card in Win 7 and CS4 (not working correctly), I finally resolved it by swapping in an NVidia GTS250 card... but all my troubles are not over yet.
    I'm running Premiere CS4 and have finally just got down to testing creation of a Blu Ray test disc from some AVCHD 1920x1080 HD footage I am editing from my new camera.
    My work in CS4 is just testing at present as I used to use CS3 [still have on my dual core machine with Matrox RTX2]  - and have successfully created Blu Ray discs from HDV material in CS3 - but using Blu Ray MPEG 2 settings.
    New machine: Win 7 64-bit
    Premiere CS4 is at version 4.2.0
    Encore is at version 4.0.1
    These are the latest updates.
    I've selected a short test piece (about 2 mins) on the Premiere timeline and initially tried exporting it using Adobe Media Encoder.
    I selected the H.264 Blu-ray codec in Media Encoder: high quality, 1080, 25 fps (PAL).
    This creates separate MP4 video and audio streams.
    The encode process goes OK, just slightly slower than real time (I'm using an i7-930 quad core and Win 7 64-bit).
    Now, when I come to import the encoded video file into Encore as an asset, Encore freezes and the whole machine locks up!
    The audio file imports OK, but Encore does not like the encoded video file.
    I've re-installed Premiere and Encore twice in case of bad install. It wasn't that.
    I also found Adobe Media Player couldn't play the file created by AME either - but it would play on a Premiere timeline?
    So is Adobe Media Encoder messing up here?
    So, three hours of testing later, I stumbled on a support thread where someone found the AME preset has "High" as the Encoding Profile when it should be "Main".
    Not sure what these settings do, but I tried changing that parameter and re-encoded the video file with the H.264 Blu-ray setting - all to no avail, as the machine just locks up again when I try importing.
    My final search in the Premiere help file about creating Blu-ray discs only talks about using the MPEG-2 Blu-ray setting for creating Blu-ray discs.
    It does not mention H.264. Why is that?
    Can one of the moderators / experts confirm whether the H.264 Blu-ray format is supported for Blu-ray disc creation in CS4, please?
    Better still, please tell me the exact settings for Adobe Media Encoder so the output will import into Encore without having to be transcoded again.
    H.264 should work - yes?
    Final comment
    As an alternative method, I tried sending the short timeline from Premiere (using Dynamic Link) to Encore, to test that route for making a Blu-ray disc.
    Bearing in mind the sequence is only about 2 mins long, when I try to transcode using the engine within Encore the process takes a ridiculously long time, and twice it has crashed just before the end of the process.
    Can someone explain why the encode process in Encore takes such a long time, as I thought it used the Adobe Media Encoder engine as well?
    Adobe Media Encoder is much faster in stand-alone mode.
    Does Encore not use it?
    Anyway, my conclusion is that using Encore to encode the timeline is a non-starter as it takes too long.
    Why so slow on such a fast machine? I know HD encoding is demanding, but not that demanding with a decent machine.
    So the question is ....
    How do I encode in the most efficient and fastest way possible (using H.264) for Blu-ray authoring in Encore without the need for more transcodes?
    Any help or suggestions welcome as I'm now hitting my head against the wall :-)
    I have not gone through the Adobe support desk as it is usually a waste of time. {Dear VP of Adobe Customer Service - please do get this sorted out in the near future if you want to retain loyal customers...}
    I cannot believe how disappointing  CS4 is turning out to be.
    After investing in a very fast PC machine etc the whole user experience is somewhat of a let down.
    Why so many issues for such a small test??

    Picking up on my original post about H.264: as I have a dual-boot machine, I also have a cleanly installed Win 7 64-bit partition which I use for general bits of work.
    On that I have CyberPowerDVD software installed, which plays all video files fine, including Blu-ray material.
    Interestingly, when I try to play the file that AME created from the Blu-ray H.264 preset, it says "It cannot play the file as it is a format it does not recognise..."
    This is really weird and would suggest AME is not encoding the file correctly?
    I have already removed and re-installed CS4 on my main [video editing] partition, but some fragments of CS4 were left intact.
    How do I completely remove everything?
    I tried the CS4 clean script, but it said it relied on the Microsoft Clean Up utility and gave a link.
    When I looked on the Microsoft website, the link to download it had been removed?
    Also, Adobe say the CS4 removal script is no longer supported?
    Can anyone advise on the correct tool/procedure to completely remove CS4 from a Win 7 64-bit installation, so I can then try re-installing CS4 one more time in case the installation is corrupt in some way?
    I can't explain why I am having this problem otherwise, if other people find AME encodes OK with the H.264 Blu-ray preset.

  • Problems creating Shutterfly Photo Book with PSE 9

    Has anyone run across this problem? I searched the forum here and nothing came up. I created a photo book using the Elements 9 Organizer. I selected the Order button; it saved the files, transferred them, then opened a Shutterfly window to complete the order. The single image that Shutterfly would show was the cover, to center the title page image in a center-cut space on the cover. The image looked compressed and not centered. I exited the program, re-centered the image and selected Order again. Same result: the image looked centered but compressed. I went to the Shutterfly website to make sure that the images were not compressed like that (since it saves your project under your profile). Instead I found that some pages were correct, and some (like the title page) took an entire page that I made in Elements and compressed it into a single frame of a three-framed picture page, for example. The whole project is a total mess! I'm just glad I looked on the Shutterfly site before pressing the purchase button and paying $30 for a book to be delivered looking like that! Does anyone know what I am doing wrong? I pretty much used all of the prompts in Elements to create the book, and since Shutterfly is supposed to be seamlessly integrated into this, I thought it would work without problems. Please let me know what I am doing wrong. Thanks,
    Dave

    I have searched this entire forum for help with this issue and there is nothing. Can anyone see what I am doing wrong?
    Thanks!

  • Problem creating an outgoing payment with DI API

    Hi
    I've created a form that allows a user to select a BP and automatically reconcile the accounts for this BP by creating an Outgoing Payment. It all works fine for Invoices, Credits, Purchase Invoices etc. However, if the account balance contains Incoming or Outgoing Payments, I am not sure how to treat these, as there is no invoice type (SAPbobsCOM.BoRcptInvTypes) for them. For the other types of document I simply declare what type of invoice the document to be added to the Outgoing Payment is, by looking at the TransType field of the record from the JDT1 table. I then use the JDT1.BaseRef field to retrieve the document's DocNum, which I use to return the DocEntry by looking in the relevant table, e.g. OINV for invoices. All I then need to do is set the OutgoingPayments.Invoices.DocEntry value to this DocEntry and add the line.
    I also have the same problem with Opening Balances, as I can't figure out what to set the DocEntry value to: the Opening Balance seems to be stored as a Journal Entry, which uses the TransId for its primary key, however that does not seem to work.
    Thanks a lot.
    Steve

    Hello Steven,
    You can use the createdby field to determine the DocEntry of the base document instead of querying DocEntry by DocNum:
    select createdby, transtype, baseref from jdt1
    Note: always set the correct document type for payments (createdby is OK in integer format).
    Example:
    Opening Balance:  JE # 8 amount 1000, created by Opening balance (not JE)
       vPay.Invoices.DocEntry = 8
       vPay.Invoices.InvoiceType = BoRcptInvTypes.it_OpeningBalance
       vPay.Invoices.SumApplied = 1000
       Call vPay.Invoices.Add
    Opening Balance:  JE # 8 amount 1000, created by Journal Entry
       vPay.Invoices.DocEntry = 8
       vPay.Invoices.InvoiceType = BoRcptInvTypes.it_JournalEntry
       vPay.Invoices.SumApplied = 1000
       Call vPay.Invoices.Add
    Sales Invoice:  DocEntry: 11  amount 1000
       vPay.Invoices.DocEntry = 11
       vPay.Invoices.InvoiceType = BoRcptInvTypes.it_Invoice
       vPay.Invoices.SumApplied = 1000
       Call vPay.Invoices.Add
    Regards,
    J.

  • Problem creating A/R Invoice with Withholding tax data via DI-Server

    Hi!
    Using the following SOAP request to the DI-Server, I want to create an A/R Invoice with withholding tax data, but it always responds with an error saying
    Total taxable amount of all rows exceeds the base amount  [INV5.TaxbleAmnt][line: 1]
    SOAP Request:
    <?xml version="1.0" encoding="UTF-16"?>
    <env:Envelope xmlns:env="http://www.w3.org/2003/05/soap-envelope">
      <env:Header>
        <SessionID>203A3C01-7808-4638-8322-2307DF3C0F8F</SessionID>
      </env:Header>
      <env:Body>
        <dis:Add xmlns:dis="http://www.sap.com/SBO/DIS">
          <Service>InvoicesService</Service>
          <Document>
            <DocType>dDocument_Items</DocType>
            <HandWritten>tNO</HandWritten>
            <DocDate>2010-02-08</DocDate>
            <DocDueDate>2010-02-08</DocDueDate>
            <TaxDate>2010-02-08</TaxDate>
            <VatDate>2010-02-08</VatDate>
            <CardCode>NPI</CardCode>
            <Comments>test di-server soap message 1</Comments>
            <DocumentLines>
              <DocumentLine>
                <ItemCode>TRC</ItemCode>
                <Quantity>1</Quantity>
                <Price>1000</Price>
                <TaxCode>OVAT</TaxCode>
                <VatGroup>OVAT</VatGroup>
                <TaxLiable>tYES</TaxLiable>
                <WTLiable>tYES</WTLiable>
              </DocumentLine>
            </DocumentLines>
            <WithholdingTaxDataCollection>
              <WithholdingTaxData>
                <WTCode>C140</WTCode>
                <TaxableAmount>1000</TaxableAmount>
                <WTAmount>100</WTAmount>
              </WithholdingTaxData>
            </WithholdingTaxDataCollection>
          </Document>
        </dis:Add>
      </env:Body>
    </env:Envelope>
    OVAT rate above is 10%.
    The withholding tax code C140 is setup as
    rate=10,
    Category=Payment,
    Base Type=Net,
    % Base Amount = 100,
    Rounding Type = Commercial Values.
    We are using the New Zealand/Australia localization in SAP B1 2007A PL49.
    The above request succeeds only if I set the <WithholdingTaxDataCollection> node to:
            <WithholdingTaxDataCollection>
              <WithholdingTaxData>
                <WTCode>C140</WTCode>
                <TaxableAmount>0</TaxableAmount>
                <WTAmount>100</WTAmount>
              </WithholdingTaxData>
            </WithholdingTaxDataCollection>
    i.e. setting TaxableAmount equal to 0, which is not desired.
    Can anyone offer some help, please?
    Thanks.

    Albert,
    Did you try adding this via the DI API and not the DI Server?  Do you get the same error?  Please see this SAP Note ...
    https://websmp130.sap-ag.de/sap(bD1lbiZjPTAwMQ==)/bc/bsp/spn/smb_searchnotes/display.htm?note_langu=E&note_numm=0001303019
    Eddy

  • Problem creating physical Standby database with RMAN

    Hi All
    I am trying to learn Oracle Data Guard and, as part of the learning process, am creating a standby database.
    Platform: Sun-Fire-V250 SPARC, Solaris 10
    Database version: Oracle 11gR2
    I am creating the standby database on the same server, so the directory structure is different.
    Following the instructions on the Oracle site, I managed to create a functional physical standby database manually. But I am not able to create a standby database using RMAN. These are the steps that I followed:
    1. Set up all the necessary parameters on the primary database, as done while creating the physical standby database manually, e.g. enabling force logging, creating standby logs etc.
    2. Edited the parameter file on the primary database, as done during manual physical standby database creation. Some of the changes are:
    On Primary Database:
    *.FAL_CLIENT='orcl11020' #Primary database unique name
    *.FAL_SERVER='stdby_11' #Standby database unique name
    db_file_name_convert='/<dir>/oradata/stdby_11','/<dir>/oradata/orcl11020'
    log_file_name_convert='/<dir>/oradata/stdby_11','/<dir>/oradata/orcl11020','/<dir>/oradata/stdby_11/redo_mem','/<dir>/oradata/orcl11020/redo_mem'
    standby_file_management=auto
    *.log_archive_config='DG_CONFIG=(orcl11020,stdby_11)'
    *.log_archive_dest_1='LOCATION=/<dir>/flash_recovery_area/ORCL11020/archivelog
    VALID_FOR=(ALL_LOGFILES,ALL_ROLES) db_unique_name=orcl11020'
    *.log_archive_dest_2='SERVICE=stdby_11 LGWR ASYNC VALID_FOR=(ONLINE_LOGFILES,PRIMARY_ROLE) db_unique_name=stdby_11'
    *.LOG_ARCHIVE_DEST_STATE_1='ENABLE'
    *.LOG_ARCHIVE_DEST_STATE_2='ENABLE'
    *.LOG_ARCHIVE_FORMAT='%t_%s_%r.arc'
    *.LOG_ARCHIVE_MAX_PROCESSES=30
    Copied same pfile for standby database and modified following-
    *.control_files='/<dir>/oradata/stdby_11/stdby_11.ctl','/<dir>/fra_stdby/stdby_11/stdby_11.ctl'
    *.db_name='orcl1102'
    *.db_unique_name='stdby_11'
    *.FAL_CLIENT='stdby_11'
    *.FAL_SERVER='orcl11020'
    db_file_name_convert='/<dir>/oradata/orcl11020','/<dir>/oradata/stdby_11'
    log_file_name_convert='/<dir>/oradata/orcl11020','/<dir>/oradata/stdby_11','/<dir>/oradata/orcl11020/redo_mem','/<dir>/oradata/stdby_11/redo_mem'
    standby_file_management=auto
    *.log_archive_dest_1='LOCATION=/<dir>/fra_stdby/STDBY_11/archivelog
    VALID_FOR=(ALL_LOGFILES,ALL_ROLES) db_unique_name=stdby_11'
    *.log_archive_dest_2='SERVICE=orcl11020 LGWR ASYNC VALID_FOR=(ONLINE_LOGFILES,PRIMARY_ROLE)
    db_unique_name=orcl11020'
    3. Added the relevant entries to the tnsnames.ora and listener.ora files, then restarted the listener.
    4. Created a password file with the same credentials as the primary database.
    5. Made sure an up-to-date RMAN backup of the primary database is available.
    6. Created a standby controlfile with RMAN while the primary database is open (I tried with the primary database in mount mode as well):
    $> rman catalog rman/paswd@rman target /
    RMAN> BACKUP CURRENT CONTROLFILE FOR STANDBY;
    7. Opened a new terminal and started the standby database in nomount mode using the parameter file created:
    $>ORACLE_SID=stdby_11
    $>export ORACLE_SID
    $>sqlplus / as sysdba
    SQL>STARTUP NOMOUNT pfile='<location/initfilename.ora'
    SQL>quit
    $> rman AUXILIARY / target sys/passwd@orcl11020 catalog rman/passwd@rman
    RMAN>DUPLICATE TARGET DATABASE FOR STANDBY DORECOVER;
    RMAN finishes without errors, but archive logs are not being transported. Looking at the log, the following caught my eye:
    Error 1017 received logging on to the standby
    Check that the primary and standby are using a password file
    and remote_login_passwordfile is set to SHARED or EXCLUSIVE,
    and that the SYS password is same in the password files.
    returning error ORA-16191
    FAL[client, ARC2]: Error 16191 connecting to orcl11020 for fetching gap sequence
    Errors in file /<dir>/diag/rdbms/stdby_11/stdby_11/trace/stdby_11_arc2_24321.trc:
    ORA-16191: Primary log shipping client not logged on standby
    Errors in file /<dir>/diag/rdbms/stdby_11/stdby_11/trace/stdby_11_arc2_24321.trc:
    ORA-16191: Primary log shipping client not logged on standby
    So on both primary and standby I confirmed:
    SQL> show parameter remote_login_passwordfile
    NAME                          TYPE     VALUE
    remote_login_passwordfile     string   EXCLUSIVE
    To make doubly sure that the password files are the same, I shut down both databases, deleted the password files and recreated them with the same credentials.
    The password files are called orapworcl11020 and orapwstdby_11.
    Can someone guide me as to where things are going wrong here, please?

    Not sure if I understood it clearly.
    SELECT * FROM V$ARCHIVE_GAP;
    returns no rows so there is no gap.
    But could you please explain the result of the previous query to me? To catch up again, on the standby when I check:
    SELECT SEQUENCE#, APPLIED FROM V$ARCHIVED_LOG;
    SEQUENCE#  APPLIED
    75         NO
    74         NO
    76         NO
    77         NO
    I understand that, although the archive files have been copied across, they have not been applied yet.
    On the primary, when I run your query:
    SELECT name AS standby, sequence#, applied, completion_time
    FROM v$archived_log
    WHERE dest_id = 2
    AND sequence# BETWEEN 74 AND 80;
    I get:
    STANDBY    SEQUENCE#  APPLIED  COMPLETIO
    stdby_11   74         YES      28-JUN-11
    stdby_11   75         YES      28-JUN-11
    stdby_11   76         YES      29-JUN-11
    stdby_11   77         YES      29-JUN-11
    stdby_11   78         YES      29-JUN-11
    stdby_11   79         YES      29-JUN-11
    stdby_11   80         YES      29-JUN-11
    stdby_11   75         NO       07-JUL-11
    stdby_11   74         NO       07-JUL-11
    stdby_11   76         NO       07-JUL-11
    stdby_11   77         NO       07-JUL-11
    stdby_11   78         NO       07-JUL-11
    I have intentionally given
    sequence# BETWEEN 74 and 80
    because I know that in the current incarnation of the database, the maximum sequence is 78.
    So my understanding is that the rows between 28 and 29 June are from the previous incarnation - correct me if I am wrong.
    Archive files of the current incarnation, created since I successfully built the standby database, are shipped but yet to be applied - am I right?
    Then my final question is: when will these archives be applied to the standby database?
    I am sorry to ask so many questions, but I am just trying to understand how it all works.
    Thanks for your help again

  • Problem creating dynamic component children with RestoreState of JSF 1.2

    Hello everybody,
    With JSF 1.1, it was possible to create children of a component dynamically in the constructor of a component.
    But now, with JSF 1.2, there is an issue with the RestoreState as a new instance of each component of the tree is created.
    Does someone have an idea how to solve this issue?
    One possibility is to create the children not in the component itself but in the component's renderer,
    but I'm not really convinced by this solution...
    Thank you in advance.
    bgOnline

    You can create a new component dynamically in the method setParent() according to the following code snippet:
    public void setParent(UIComponent parent) {
        if (parent != null) {
            List<UIComponent> children = parent.getChildren();
            Application application = FacesContext.getCurrentInstance().getApplication();
            // Create the child dynamically and mark it transient so it is not saved in the view state.
            componentLabel = (HtmlOutputLabel) application
                    .createComponent(HtmlOutputLabel.COMPONENT_TYPE);
            componentLabel.setTransient(true);
            children.add(componentLabel);
        } else if (componentLabel != null) {
            // The component is being detached: remove the dynamically created child from the old parent.
            List<UIComponent> children = getParent().getChildren();
            children.remove(componentLabel);
        }
        super.setParent(parent);
    }

  • Having problems creating a zip file with Japanese language character names

    I have a bunch of files with names in Japanese characters (also Chinese, Korean, Spanish etc, but this will do for an example). The encoding is Unicode.
    I wish to be able to put them in a zip file, then extract them again using any old zip tool (WinZip,PKZip,7-Zip etc)
    Trouble is, every time I do it, the files inside the zipfile end up with garbage character names. The contents of the files are fine, though.
    I'm aware that there used to be a bug in the java.util.zip.* classes regarding character encodings (http://bugs.sun.com/bugdatabase/view_bug.do;jsessionid=5bd4fe01ad8a7b4ec89afef5005da?bug_id=4244499), but as far as I can see it is supposed to have been fixed. I have also tried the ZipOutputStream class from Apache, which allows you to set the encoding manually - no luck there either (just many different varieties of garbage characters).
    Test code below. What am I missing here?
    import java.io.File;
    import java.io.FileOutputStream;
    import java.io.FileReader;
    import java.io.IOException;

    public class ZipTest {

        // Japanese file name (posted on the forum as HTML entities): "west", ideographic space, "pure"
        static String uniqueFileName = "\u897F\u3000\u7D14.txt";

        public static void main(String[] args) {
            try {
                standardZip();
                apacheZip();
            } catch (IOException fe) {
                System.out.println("problem in file - exception " + fe.getMessage());
            }
        }

        public static void apacheZip() throws IOException {
            File inputFile = new File(uniqueFileName);
            FileReader reader = new FileReader(inputFile);
            String encoding = reader.getEncoding();  // the platform default encoding
            System.out.println("Using Apache - Encoding = " + encoding);
            File zipFile = new File("apacheZip.zip");
            org.apache.tools.zip.ZipOutputStream zipOutputStream =
                    new org.apache.tools.zip.ZipOutputStream(zipFile);
            zipOutputStream.setEncoding(encoding);  // encoding used for the entry names
            org.apache.tools.zip.ZipEntry zipEntry = new org.apache.tools.zip.ZipEntry(uniqueFileName);
            zipEntry.setSize(inputFile.length());
            zipEntry.setTime(inputFile.lastModified());
            zipOutputStream.putNextEntry(zipEntry);
            int c;
            while ((c = reader.read()) >= 0) {
                zipOutputStream.write(c);  // copy the file contents character by character
            }
            zipOutputStream.closeEntry();
            zipOutputStream.finish();
            zipOutputStream.close();
            reader.close();
        }

        public static void standardZip() throws IOException {
            File inputFile = new File(uniqueFileName);
            FileReader reader = new FileReader(inputFile);
            String encoding = reader.getEncoding();
            System.out.println("Using Java IO - Encoding = " + encoding);
            File zipFile = new File("standardZip.zip");
            FileOutputStream zipOut = new FileOutputStream(zipFile);
            java.util.zip.ZipOutputStream zipOutputStream = new java.util.zip.ZipOutputStream(zipOut);
            java.util.zip.ZipEntry zipEntry = new java.util.zip.ZipEntry(uniqueFileName);
            zipEntry.setSize(inputFile.length());
            zipEntry.setTime(inputFile.lastModified());
            zipOutputStream.putNextEntry(zipEntry);
            int c;
            while ((c = reader.read()) >= 0) {
                zipOutputStream.write(c);
            }
            zipOutputStream.closeEntry();
            zipOutputStream.finish();
            zipOutputStream.close();
            reader.close();
        }
    }

    Emma_Baillie wrote:
    I have a bunch of files with names in Japanese characters (also Chinese, Korean, Spanish etc, but this will do for an example). The encoding is Unicode.
    I wish to be able to put them in a zip file, then extract them again using any old zip tool (WinZip,PKZip,7-Zip etc)
    Trouble is, every time I do it, the files inside the zipfile end up with garbage character names. The contents of the files are fine, though.
    This is because many zip tools do not support Unicode for zip entry names. For example, WinZip prior to 11.2 does not support Unicode characters in filenames. You need to look for similar information for the other tools; you can find more on the tools on their websites.
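
    As a side note (not from the original thread): since Java 7, java.util.zip.ZipOutputStream accepts the charset to use for entry names, which avoids relying on the platform default; whether the names then display correctly still depends on the unzip tool supporting that encoding. A minimal sketch, reusing the file name from the question:

    import java.nio.charset.StandardCharsets;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.util.zip.ZipEntry;
    import java.util.zip.ZipOutputStream;

    public class Utf8ZipExample {
        public static void main(String[] args) throws Exception {
            Path input = Paths.get("\u897F\u3000\u7D14.txt");  // the Japanese file name from the question
            try (ZipOutputStream zos = new ZipOutputStream(
                    Files.newOutputStream(Paths.get("utf8.zip")), StandardCharsets.UTF_8)) {
                zos.putNextEntry(new ZipEntry(input.getFileName().toString()));
                Files.copy(input, zos);  // copy the file as raw bytes, not characters
                zos.closeEntry();
            }
        }
    }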

  • Problems creating hrp1000/1001 infotypes with HR function RH_INSERT_INFTY

    Hi,
    I'm trying to use FM RH_INSERT_INFTY in an RFC to create Personnel Planning infotypes 1000 and 1001 (i.e. to create a position). They are linked, so I've tried creating them with update type (VTASK) 'D' (dialog) as well as 'B' (buffer update), to fill the buffer with the 1000 and 1001 records before applying the update. Whichever way I try, I get the error "Creation of object ID 00000000 is not allowed" when I call FM RH_INSERT_INFTY.
    Any ideas what I'm doing wrong, please?
    Can you create an infotype 1000 and 1001 in one call to the FM, or do you have to call it twice, once for each infotype?
    Thanks,
    Craig.

    Hi Craig,
    To create an HR object, try the FM "RH_CREATE_OBJECT". To create relation data you can use "RH_INSERT_INFTY_1001".
    Or you can make use of the HR BAPIs written for this.
    For HR objects (comprising also PD) you can use the Business Object "HRMasterDataReplica". It has a class method "SaveReplicaMultiple" where you can pass your data to the interface.
    You mainly fill "BAPIHROBJ" and "BAPIHRINF", plus the other infotype-related parameters, e.g. "PdObject" (structured like P1000) for infotype 1000 and "PdObjectRelationships" (structured like P1001) for infotype 1001.
    These are BO names; you can use the transaction code "BAPI" for more information.
    The BAPI for this method is also discussed at BAPI_HRMASTER_SAVE_REPL_MULT.
    And as a last thing, let me introduce you to the SDN forums' points system: you can assign points to posts you find helpful while solving your question, by clicking the yellow star icon in the header of each reply post. You can award:
    - one 10 points (solved)
    - two 6 points (very helpful answer)
    - many 2 points (helpful answer)
    Kind Regards...
    *--Serdar
