Problem with data source: full load is not extracting, whereas delta can

Hi All,
I have a problem loading region values into BW.
DataSource 0CRM_OPPT_I is used to fetch opportunities (the CRM equivalent of sales orders in ECC) from the CRM system. The DataSource has a field, 0CRM_TR, which holds the region of the employee responsible. The problem is that the 0CRM_TR value is not fetched when the record is first created; the value only flows to BW in the delta run, i.e. when a change happens to the record.
Could anyone help us resolve this issue?
Regards,
Haaris.

Hi Haaris,
Have you done an init? If yes, which kind: init with or without data transfer? Did it pick up records? If yes, after the init, check the delta queue (RSA7) in the source system: is the Total field 0 or not? If it is 0, then:
Look, if you are using Direct Delta, then after the init any changes or new records are recorded in the delta queue (RSA7) directly. But if the update mode is Unserialized V3 Update or Queued Delta, data accumulates in the update tables or the extraction queue respectively. In that case you have to schedule V3 jobs to pull the records into RSA7, otherwise the delta load will pick up 0 records.
Check the update mode of the DataSource in LBWE.
This may help:
Steps to do init load from CRM
Regards,
Debjani.

Similar Messages

  • I'm having some problems with date intervals; it's not working well. I inserted the correct date within the right interval, but the system refers to an incorrect date. What should I do?

    Message edited by: Mauricio Galletti
    It seems there's some problem with the specific date 19/10/2014, though I don't know why. I just removed this date from the interval and the problem was gone.
    What's wrong with 19/10/2014?

    Hello Galdr,
    Welcome to the HP Forums.
    I see that after doing some updates, you've lost the use of switchable graphics in your Notebook. I will do my best to help you with this.
    One thing you can try is the HP Recovery Manager. This will allow you to "recover" the Notebook and reinstall the original drivers. The document Using Recovery Manager to Restore Software and Drivers (Windows 7) can assist you with that.
    Once the drivers are "reverted", then if you wish you can attempt updating.
    The first thing I will provide regarding updating is this document: Switchable Graphics on Notebooks Configured with Dual AMD GPUs. It is more general information than anything, but you may find it useful.
    When updating, you will want to use the website or the HP Support Assistant; these are the recommended ways to update. AMD's utility (or other OEM utilities) will look for compatible drivers for the component, but not necessarily ones that work with your Notebook. What most likely happened is that only one of the two drivers was actually updated.
    Here is a link to the newest driver provided by HP: AMD High-Definition (HD) Graphics Driver. Also, here's a link to using the HP Support Assistant: Using HP Support Assistant (Windows 7). (in case you wanted to learn more/use this method).
    I hope this is able to help you get back to "state 1", so that you may be able to update correctly and have full functionality of your switchable graphics again. Please let me know how this goes for you. Thank you for posting on the HP Forums.
    I worked on behalf of HP.

  • Data Source - Full Load

    Hi friends, please help me with this requirement. I have a standard content DataSource that has plant, factory, industry unit and machine area fields, with "valid to" and "valid from" dates. This DataSource does not support delta.
    1. Can you help me set up the full load on a monthly basis? I have a calendar day in the DSO which I map to the "valid from" field.
    2. Is there any way to make this standard content DataSource delta capable, so it can load deltas to the ODS? No OSS notes are available for this.
    Please send me step-by-step instructions to set up the delta.
    Thanks
    Poonam

    It is not possible to create a generic DataSource, since the data fields are spread across too many tables. Please suggest another method.
    Thanks
    Poonam

  • I have a problem with Data Guard: my archive log files are not applied.

    I have a problem with Data Guard: my archive log files are not applied, even though all archive log files have been received by my physical standby database.
    I have created a Physical Standby database on Oracle 10gR2 (Windows XP professional). Primary database is on another computer.
    In Enterprise Manager on the primary database it looks OK; I get the message "Data Guard status Normal".
    But, as I wrote above, the archive log files are not applied.
    After I created the Physical Standby database, I also did the following:
    1. I connected to the Physical Standby database instance.
    CONNECT SYS/SYS@luda AS SYSDBA
    2. I started the Oracle instance at the Physical Standby database without mounting the database.
    STARTUP NOMOUNT PFILE=C:\oracle\product\10.2.0\db_1\database\initluda.ora
    3. I mounted the Physical Standby database:
    ALTER DATABASE MOUNT STANDBY DATABASE
    4. I started redo apply on Physical Standby database
    alter database recover managed standby database disconnect from session
    5. I switched the log files on Physical Standby database
    alter system switch logfile
    6. I verified the redo data was received and archived on Physical Standby database
    select sequence#, first_time, next_time from v$archived_log order by sequence#
    SEQUENCE# FIRST_TIME NEXT_TIME
    3 2006-06-27 2006-06-27
    4 2006-06-27 2006-06-27
    5 2006-06-27 2006-06-27
    6 2006-06-27 2006-06-27
    7 2006-06-27 2006-06-27
    8 2006-06-27 2006-06-27
    7. I verified the archived redo log files were applied on Physical Standby database
    select sequence#,applied from v$archived_log;
    SEQUENCE# APP
    4 NO
    3 NO
    5 NO
    6 NO
    7 NO
    8 NO
    8. On the Physical Standby database:
    select * from v$archive_gap;
    No rows
    9. On the Physical Standby database:
    SELECT MESSAGE FROM V$DATAGUARD_STATUS;
    MESSAGE
    ARC0: Archival started
    ARC1: Archival started
    ARC2: Archival started
    ARC3: Archival started
    ARC4: Archival started
    ARC5: Archival started
    ARC6: Archival started
    ARC7: Archival started
    ARC8: Archival started
    ARC9: Archival started
    ARCa: Archival started
    ARCb: Archival started
    ARCc: Archival started
    ARCd: Archival started
    ARCe: Archival started
    ARCf: Archival started
    ARCg: Archival started
    ARCh: Archival started
    ARCi: Archival started
    ARCj: Archival started
    ARCk: Archival started
    ARCl: Archival started
    ARCm: Archival started
    ARCn: Archival started
    ARCo: Archival started
    ARCp: Archival started
    ARCq: Archival started
    ARCr: Archival started
    ARCs: Archival started
    ARCt: Archival started
    ARC0: Becoming the 'no FAL' ARCH
    ARC0: Becoming the 'no SRL' ARCH
    ARC1: Becoming the heartbeat ARCH
    Attempt to start background Managed Standby Recovery process
    MRP0: Background Managed Standby Recovery process started
    Managed Standby Recovery not using Real Time Apply
    MRP0: Background Media Recovery terminated with error 1110
    MRP0: Background Media Recovery process shutdown
    Redo Shipping Client Connected as PUBLIC
    -- Connected User is Valid
    RFS[1]: Assigned to RFS process 2148
    RFS[1]: Identified database type as 'physical standby'
    Redo Shipping Client Connected as PUBLIC
    -- Connected User is Valid
    RFS[2]: Assigned to RFS process 2384
    RFS[2]: Identified database type as 'physical standby'
    Redo Shipping Client Connected as PUBLIC
    -- Connected User is Valid
    RFS[3]: Assigned to RFS process 3188
    RFS[3]: Identified database type as 'physical standby'
    Primary database is in MAXIMUM PERFORMANCE mode
    Primary database is in MAXIMUM PERFORMANCE mode
    RFS[3]: No standby redo logfiles created
    Redo Shipping Client Connected as PUBLIC
    -- Connected User is Valid
    RFS[4]: Assigned to RFS process 3168
    RFS[4]: Identified database type as 'physical standby'
    RFS[4]: No standby redo logfiles created
    Primary database is in MAXIMUM PERFORMANCE mode
    RFS[3]: No standby redo logfiles created
    10. On the Physical Standby database:
    SELECT PROCESS, STATUS, THREAD#, SEQUENCE#, BLOCK#, BLOCKS FROM V$MANAGED_STANDBY;
    PROCESS STATUS THREAD# SEQUENCE# BLOCK# BLOCKS
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    RFS IDLE 0 0 0 0
    RFS IDLE 0 0 0 0
    RFS IDLE 1 9 13664 2
    RFS IDLE 0 0 0 0
    11. On the primary database:
    select message from v$dataguard_status;
    MESSAGE
    ARC0: Archival started
    ARC1: Archival started
    ARC2: Archival started
    ARC3: Archival started
    ARC4: Archival started
    ARC5: Archival started
    ARC6: Archival started
    ARC7: Archival started
    ARC8: Archival started
    ARC9: Archival started
    ARCa: Archival started
    ARCb: Archival started
    ARCc: Archival started
    ARCd: Archival started
    ARCe: Archival started
    ARCf: Archival started
    ARCg: Archival started
    ARCh: Archival started
    ARCi: Archival started
    ARCj: Archival started
    ARCk: Archival started
    ARCl: Archival started
    ARCm: Archival started
    ARCn: Archival started
    ARCo: Archival started
    ARCp: Archival started
    ARCq: Archival started
    ARCr: Archival started
    ARCs: Archival started
    ARCt: Archival started
    ARCm: Becoming the 'no FAL' ARCH
    ARCm: Becoming the 'no SRL' ARCH
    ARCd: Becoming the heartbeat ARCH
    Error 1034 received logging on to the standby
    Error 1034 received logging on to the standby
    LGWR: Error 1034 creating archivelog file 'luda'
    LNS: Failed to archive log 3 thread 1 sequence 7 (1034)
    FAL[server, ARCh]: Error 1034 creating remote archivelog file 'luda'
    12. On the primary database:
    select name,sequence#,applied from v$archived_log;
    NAME SEQUENCE# APP
    C:\ORACLE\PRODUCT\10.2.0\ORADATA\IRINA\ARC00003_0594204176.001 3 NO
    C:\ORACLE\PRODUCT\10.2.0\ORADATA\IRINA\ARC00004_0594204176.001 4 NO
    Luda 4 NO
    Luda 3 NO
    C:\ORACLE\PRODUCT\10.2.0\ORADATA\IRINA\ARC00005_0594204176.001 5 NO
    Luda 5 NO
    C:\ORACLE\PRODUCT\10.2.0\ORADATA\IRINA\ARC00006_0594204176.001 6 NO
    Luda 6 NO
    C:\ORACLE\PRODUCT\10.2.0\ORADATA\IRINA\ARC00007_0594204176.001 7 NO
    Luda 7 NO
    C:\ORACLE\PRODUCT\10.2.0\ORADATA\IRINA\ARC00008_0594204176.001 8 NO
    Luda 8 NO
    13. On the standby database:
    select name,sequence#,applied from v$archived_log;
    NAME SEQUENCE# APP
    C:\ORACLE\PRODUCT\10.2.0\ORADATA\LUDA\ARC00004_0594204176.001 4 NO
    C:\ORACLE\PRODUCT\10.2.0\ORADATA\LUDA\ARC00003_0594204176.001 3 NO
    C:\ORACLE\PRODUCT\10.2.0\ORADATA\LUDA\ARC00005_0594204176.001 5 NO
    C:\ORACLE\PRODUCT\10.2.0\ORADATA\LUDA\ARC00006_0594204176.001 6 NO
    C:\ORACLE\PRODUCT\10.2.0\ORADATA\LUDA\ARC00007_0594204176.001 7 NO
    C:\ORACLE\PRODUCT\10.2.0\ORADATA\LUDA\ARC00008_0594204176.001 8 NO
    14. My init.ora files:
    On standby db
    irina.__db_cache_size=79691776
    irina.__java_pool_size=4194304
    irina.__large_pool_size=4194304
    irina.__shared_pool_size=75497472
    irina.__streams_pool_size=0
    *.audit_file_dest='C:\oracle\product\10.2.0\admin\luda\adump'
    *.background_dump_dest='C:\oracle\product\10.2.0\admin\luda\bdump'
    *.compatible='10.2.0.1.0'
    *.control_files='C:\oracle\product\10.2.0\oradata\luda\luda.ctl'
    *.core_dump_dest='C:\oracle\product\10.2.0\admin\luda\cdump'
    *.db_block_size=8192
    *.db_domain=''
    *.db_file_multiblock_read_count=16
    *.db_file_name_convert='luda','irina'
    *.db_name='irina'
    *.db_unique_name='luda'
    *.db_recovery_file_dest='C:\oracle\product\10.2.0\flash_recovery_area'
    *.db_recovery_file_dest_size=2147483648
    *.dispatchers='(PROTOCOL=TCP) (SERVICE=irinaXDB)'
    *.fal_client='luda'
    *.fal_server='irina'
    *.job_queue_processes=10
    *.log_archive_config='DG_CONFIG=(irina,luda)'
    *.log_archive_dest_1='LOCATION=C:/oracle/product/10.2.0/oradata/luda/ VALID_FOR=(ALL_LOGFILES, ALL_ROLES) DB_UNIQUE_NAME=luda'
    *.log_archive_dest_2='SERVICE=irina LGWR ASYNC VALID_FOR=(ONLINE_LOGFILES, PRIMARY_ROLE) DB_UNIQUE_NAME=irina'
    *.log_archive_dest_state_1='ENABLE'
    *.log_archive_dest_state_2='ENABLE'
    *.log_archive_max_processes=30
    *.log_file_name_convert='C:/oracle/product/10.2.0/oradata/irina/','C:/oracle/product/10.2.0/oradata/luda/'
    *.open_cursors=300
    *.pga_aggregate_target=16777216
    *.processes=150
    *.remote_login_passwordfile='EXCLUSIVE'
    *.sga_target=167772160
    *.standby_file_management='AUTO'
    *.undo_management='AUTO'
    *.undo_tablespace='UNDOTBS1'
    *.user_dump_dest='C:\oracle\product\10.2.0\admin\luda\udump'
    On primary db
    irina.__db_cache_size=79691776
    irina.__java_pool_size=4194304
    irina.__large_pool_size=4194304
    irina.__shared_pool_size=75497472
    irina.__streams_pool_size=0
    *.audit_file_dest='C:\oracle\product\10.2.0/admin/irina/adump'
    *.background_dump_dest='C:\oracle\product\10.2.0/admin/irina/bdump'
    *.compatible='10.2.0.1.0'
    *.control_files='C:\oracle\product\10.2.0\oradata\irina\control01.ctl','C:\oracle\product\10.2.0\oradata\irina\control02.ctl','C:\oracle\product\10.2.0\oradata\irina\control03.ctl'
    *.core_dump_dest='C:\oracle\product\10.2.0/admin/irina/cdump'
    *.db_block_size=8192
    *.db_domain=''
    *.db_file_multiblock_read_count=16
    *.db_file_name_convert='luda','irina'
    *.db_name='irina'
    *.db_recovery_file_dest='C:\oracle\product\10.2.0/flash_recovery_area'
    *.db_recovery_file_dest_size=2147483648
    *.dispatchers='(PROTOCOL=TCP) (SERVICE=irinaXDB)'
    *.fal_client='irina'
    *.fal_server='luda'
    *.job_queue_processes=10
    *.log_archive_config='DG_CONFIG=(irina,luda)'
    *.log_archive_dest_1='LOCATION=C:/oracle/product/10.2.0/oradata/irina/ VALID_FOR=(ALL_LOGFILES, ALL_ROLES) DB_UNIQUE_NAME=irina'
    *.log_archive_dest_2='SERVICE=luda LGWR ASYNC VALID_FOR=(ONLINE_LOGFILES, PRIMARY_ROLE) DB_UNIQUE_NAME=luda'
    *.log_archive_dest_state_1='ENABLE'
    *.log_archive_dest_state_2='ENABLE'
    *.log_archive_max_processes=30
    *.log_file_name_convert='C:/oracle/product/10.2.0/oradata/luda/','C:/oracle/product/10.2.0/oradata/irina/'
    *.open_cursors=300
    *.pga_aggregate_target=16777216
    *.processes=150
    *.remote_login_passwordfile='EXCLUSIVE'
    *.sga_target=167772160
    *.standby_file_management='AUTO'
    *.undo_management='AUTO'
    *.undo_tablespace='UNDOTBS1'
    *.user_dump_dest='C:\oracle\product\10.2.0/admin/irina/udump'
    Please help me!

    Hi,
    After several tries, my redo logs are now applied. I think in my case it had to do with tnsnames.ora: at the moment I have both databases, in both tnsnames.ora files, using the SID and not the SERVICE_NAME.
    Now I want to use DGMGRL. Adding a configuration and a standby database works fine, but when I try to enable the configuration, DGMGRL gives no feedback and looks like it is hanging. The log, though, says that it succeeded.
    In another session, 'show configuration' results in the following, confirming that the enable succeeded.
    DGMGRL> show configuration
    Configuration
    Name: avhtest
    Enabled: YES
    Protection Mode: MaxPerformance
    Fast-Start Failover: DISABLED
    Databases:
    avhtest - Primary database
    avhtestls53 - Physical standby database
    Current status for "avhtest":
    Warning: ORA-16610: command 'ENABLE CONFIGURATION' in progress
    Is there anybody who has experienced the same problem and/or knows the solution to this?
    With kind regards,
    Martin Schaap

  • Problem with date format from Oracle DB

    Hi,
    I am facing a problem with date fields from Oracle DB sources. The field's base type in the DB table is DATE, and its DDIC type is DATS.
    I mapped the date fields to date characteristics in BI. The data that arrives in the PSA is in a weird format; it shows up like -0.PR.09-A.
    I have tried changing the field settings in the DataSource to internal and external, and I have also tried mapping these date fields to text fields, without luck. All deliver the same format.
    I have also tried using conversion routines such as CONVERSION_EXIT_IDATE_INPUT to change the format. That also delivers the same old result.
    If anybody has any suggestions, or if any of you have experienced such problems, please share your experience with me.
    Thanks in advance.
    Regards
    Varada

    Thanks for all your replies. The solutions I see all involve creating a view in the database; I want a solution done in BI. I'd appreciate ideas from anyone who has them.
    The issue again, in detail:
    I am facing an issue with date fields from Oracle data. The data sent from Oracle is in the format -0.AR.04-M. I am able to convert this date in BI with a conversion routine into the format 04-MAR-0.
    The problem is that I am getting data of length 10 (the output format) in the format -0.AR.04-M, where the month is not numeric. Since it is text, it takes one extra character of spacing.
    I have tried converting in different ways and increased the length in BI; the result is the same. I am wondering if we can change the date format in the database.
    I am puzzled by this date format. I have checked date fields from other Oracle DB connections in BI; they receive data in the format 20.081.031, which can be converted in BI. Only the system I am working with is creating a problem.
    Regards
    Varada

  • Problem in Data Source

    Hello,
    The database connection is not created from data-sources.xml at server startup, and I am really wondering where exactly the problem is. Everywhere it is mentioned that OC4J picks up the settings from the \config directory during server start.
    I wonder why there is a problem with the database connection. Below is the data-sources.xml file in the config directory.
    <?xml version="1.0"?>
    <!DOCTYPE data-sources PUBLIC "-//Evermind//- Data-sources configuration" "http://xmlns.oracle.com/ias/dtds/data-sources.dtd">
    <data-sources>
         <data-source
              class="com.evermind.sql.DriverManagerDataSource"
              name="OracleDS"
              location="jdbc/OracleCoreDS"
              xa-location="jdbc/xa/OracleXADS"
              ejb-location="jdbc/OracleDS"
              connection-driver="oracle.jdbc.driver.OracleDriver"
              username="scott"
              password="tiger"
              url="jdbc:oracle:thin:@localhost:5521:oracle"
              inactivity-timeout="30"
         />
    </data-sources>
    Should I initialize anything before the database connection? I have tried this with JDeveloper and standalone OC4J, and the connection to the database was made. But I don't understand why there is a problem on the remote machine.
    Kindly suggest how I can proceed.
    Thanx and Regards,
    Achyuth

    Achyuth -- Data source minimum connections were not always honored in releases prior to v9021. In releases v9021 and later, the behavior is honored with certain restrictions that are described in the documentation. From reading another posting you made, it appears that you are on v1022x, so connections are not made on startup but on first use.
    Thanks -- Jeff
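    Since connections are created on use rather than at startup, one way to verify the data source is simply to use it. Below is a minimal sketch, assuming it runs inside OC4J (for example from a servlet) so the JNDI context is available; the class and method names are illustrative, and "jdbc/OracleDS" is the ejb-location from the data-sources.xml above.

    import java.sql.Connection;
    import javax.naming.InitialContext;
    import javax.sql.DataSource;

    public class DataSourceSmokeTest {
        public static void check() throws Exception {
            // "jdbc/OracleDS" is the ejb-location declared in data-sources.xml
            InitialContext ctx = new InitialContext();
            DataSource ds = (DataSource) ctx.lookup("jdbc/OracleDS");
            // Opening a connection forces a physical connection on first use
            Connection con = ds.getConnection();
            try {
                System.out.println("Connected to: " + con.getMetaData().getURL());
            } finally {
                con.close();
            }
        }
    }

    If this throws an exception, the url, username, and password in data-sources.xml are the first things to check.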

  • URGENT! JDev 10.1.2: problem with data control generated from session bean

    I have a problem with a data control generated from a session bean that returns a collection of data transfer objects.
    The DTOs seem to be correct: the session bean loads the data into them correctly and the objects are full of data. Using the console to display the DTO content works fine.
    When generating a data control from this session bean and associating the DTO included in the collection, only the first object level and one-to-one DTO objects are correctly set in the data control. Objects that represent collections inside the DTO (one-to-many foreign keys) are set as collections with an iterator, but the structure of the objects is not set. I don't know how to associate this second level of collection with the DTO bean class to obtain the attribute definitions.
    I created a test case with the hr schema, like the hrApp demo application in the tutorial, with the departments and employees tables, and I get the same problem.
    Is it a bug?
    Does a workaround exist to force the data control to understand the collection's data structure?
    Help is welcome; this is urgent!

    We solved the problem by assigning the child DTO bean class to the node representing the iterator in the XML file corresponding to the master DTO.
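    For reference, a minimal sketch of the nested DTO shape discussed in this thread, using the hr schema names from the test case above. The class and accessor names are illustrative, not taken from the original project, and the raw List matches the pre-generics Java of the JDeveloper 10.1.2 era.

    import java.io.Serializable;
    import java.util.List;

    // Master DTO: returned by the session bean inside a collection
    public class DepartmentDTO implements Serializable {
        private int departmentId;
        private String departmentName;
        private List employees; // one-to-many: a collection of EmployeeDTO

        public int getDepartmentId() { return departmentId; }
        public void setDepartmentId(int id) { this.departmentId = id; }
        public String getDepartmentName() { return departmentName; }
        public void setDepartmentName(String name) { this.departmentName = name; }
        // This nested collection is the "second level" whose element type the
        // data control cannot infer on its own; the fix described above names
        // the child bean class on the iterator node of the master DTO's XML file.
        public List getEmployees() { return employees; }
        public void setEmployees(List employees) { this.employees = employees; }
    }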

  • Failed to acquire DAS with data sources that match the specified OCE

    Please help, Thank you all.
    We are migrating Oracle databases from Windows to Linux servers. Currently I have no problem using my desktop Hyperion Intelligence Designer 8.5 with the new OCE file to connect to Linux. However, when I published the same OCE file to Hyperion Performance Suite 8.5, I got "Server Error [2018]: Failed to acquire Data Access Service with data sources that match the specified OCE".
    I launched the Service Configurator; in "Local Service Configurator - DAS1_host", the database JDBC URL points to the Linux server. I also added an ODBC connection to the Oracle database on Linux in "Properties of Data Access Service: DAS1_host's Data Source". The JDBC URLs in BI1_host, AZ1_host, PUB1_host, SM1_host, LS1_host, UT1_host, BPS1_host and AN1_host also point to the Linux server.
    In the "Remote Service Configurator", ES1_host's "Storage" has its JDBC URL pointing to the Linux server, and the same goes for RM1_host's and NS1_host's "Storage".
    Is there some missing configuration on the server? Thank you in advance.

    You should log in to My Oracle Support to look into this ID.
    This is the solution mentioned:
    Solution
    1) In the .profile file you will find the definition of the LD_LIBRARY_PATH variable. However, the path to the 64-bit libraries is probably appended before the path to the 32-bit libraries (%ORACLE_HOME%/lib32). Therefore, please reverse the order in the definition so that the 32-bit libraries get loaded before the 64-bit libraries.
    Then restart all services.
    2) Oracle 9.2 was the first Oracle release which defaults to using the 64-bit libraries in the 'lib' directory; the 32-bit libraries that we need are in the 'lib32' directory.
    Therefore, please place %ORACLE_HOME%/lib32 at the front of the LD_LIBRARY_PATH environment variable definition. Then restart all services.
    3) Please check that users have read and write access to the files located in the %ORACLE_HOME%/lib32 folder.
    4) Check that the permissions of the libclntsh.a file (located in the %ORACLE_HOME%/lib32 folder) are set to rwxr-xr-x.
    5) Recreate the OCE connection in DAS and restart DAS in CMC.
    6) Restart all the BIPlus Services and process the query.

  • Problem with data

    Hi,
    I have a problem with a data load. For the 12th month I don't have any actual data (related to finance) in R/3, but when I look in the cube there is data. I don't know how the data is coming from R/3 without any actual data. How can I check whether that data came from R/3 or from some other place, and once I find out, how and from where should I delete it? Please help me solve this issue.

    Hi,
    If there is data in the InfoCube, there should be a request. Use the 'Contents of data target' option in Manage, select any data record that is in scope, and find its request number. Then find the same request on the Requests tab of the InfoCube's Manage screen. Click the monitor symbol for that request, which takes you to the monitor screen where you can find the source system, the InfoPackage, and the DataSource used in the upload.
    If there is no ALE scenario in your system landscape, then there should be no way for data to exist in BW that is missing in R/3, so check the data flow and availability once again.
    Reg,
    Vish.

  • Having a problem with dates when I send my Numbers doc to Excel: the dates are all out, and the recipients have to cut and paste individual entries onto their spreadsheet. Any idea how I can prevent this?

    Having a problem with dates when I send my Numbers doc to Excel: the dates are all out, and the recipients have to cut and paste individual entries onto their spreadsheet. Any idea how I can prevent this?
    I'm using Lion on an MBP, and Numbers is the latest version.

    Can you give more details about what is wrong with your dates?
    M…oSoft products aren't allowed on my machines, but I use LibreOffice, which is a clone of Office.
    When I export from Numbers to Excel and open the result with LibreOffice, the dates are correctly treated.
    To be precise, dates after 01/01/1904 are correctly treated. Dates before 01/01/1904 are exported as strings, but as that is flagged during the export process, it's not surprising.
    Yvan KOENIG (VALLAURIS, France) mardi 3 janvier 2012
    iMac 21”5, i7, 2.8 GHz, 12 Gbytes, 1 Tbytes, mac OS X 10.6.8 and 10.7.2
    My iDisk is : http://public.me.com/koenigyvan
    Please : Search for questions similar to your own before submitting them to the community
    For iWork's applications dedicated to iOS, go to :
    https://discussions.apple.com/community/app_store/iwork_for_ios

  • Problem with Date format

    Got one more problem, Marilyn and Radhakrishnan...
    Regarding the solution you provided me earlier in the thread "Problem with date format":
    I am able to change 2400 to 0000, but when it is changed from 2400 on Jan 1st to 0000, the hour changes and the date does not: the date still remains Jan 1st instead of Jan 2nd.
    E.g.: Jan 1st 2400 is changed to Jan 1st 0000,
    instead of Jan 2nd 0000.
    Could you please help me with this issue?
    Thanks,
    GK

    GK, try this...
    decode(instr(packagename.functionname(param1, param2), '2400'), 0,
           to_date(to_char(to_date(rtrim(packagename.functionname(param1, param2), '(PT)'), 'Month dd, yyyy "at" hh24mi'), 'mm/dd/yyyy hh24mi'), 'mm/dd/yyyy hh24mi'),
           to_date(to_char(to_date(rtrim(packagename.functionname(param1, param2), '(PT)'), 'Month dd, yyyy "at" "2400"') + 1, 'mm/dd/yyyy "0000"'), 'mm/dd/yyyy "0000"'))
    - Marilyn
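    The key move in Marilyn's expression is the "+ 1": when the string contains '2400', only the date part is parsed, and one day is added before midnight is formatted. The same idea in Java, as an illustrative sketch; the input pattern mirrors the "Month dd, yyyy at hh24mi" format implied by the rtrim/to_date calls above, and the class and method names are made up for the example.

    import java.text.SimpleDateFormat;
    import java.util.Calendar;
    import java.util.Date;

    public class MidnightRollover {
        // Turns "January 1, 2007 at 2400" into Jan 2nd 0000 rather than Jan 1st 0000
        public static Date parseWithRollover(String raw) throws Exception {
            if (raw.indexOf("2400") >= 0) {
                // Parse only the date part, then roll forward one day to midnight
                String dayPart = raw.substring(0, raw.indexOf(" at"));
                Calendar cal = Calendar.getInstance();
                cal.setTime(new SimpleDateFormat("MMMM d, yyyy").parse(dayPart));
                cal.add(Calendar.DAY_OF_MONTH, 1); // 2400 == 0000 of the next day
                return cal.getTime();
            }
            return new SimpleDateFormat("MMMM d, yyyy 'at' HHmm").parse(raw);
        }

        public static void main(String[] args) throws Exception {
            // Prints Jan 2nd 00:00:00 2007 in the default time zone
            System.out.println(parseWithRollover("January 1, 2007 at 2400"));
        }
    }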

  • Problem with date format dd/mm/yyyy; I need to convert to yyyy-mm-dd

    Dear friends,
    I have a problem with the date format. I receive dates in the format dd/mm/yyyy, but I can upload to MySQL only in the format yyyy-mm-dd.
    How should I handle this situation? For this I've created the code lines below, but I have some problem with them. Please help me solve this problem.
    String pattern = "yyyy-MM-dd"; // note: MM is months; lowercase mm means minutes
    SimpleDateFormat format = new SimpleDateFormat(pattern);
    try {
        Date date = format.parse("2006-02-12");
        System.out.println(date); // Date.toString() ignores the pattern
    } catch (ParseException e) {
        e.printStackTrace();
    }
    System.out.println(format.format(new Date()));
    This output gives me Tue Apr 03 00:00:00 IST 2007, but I need the date in yyyy-mm-dd format.
    regards,
    maza
    thanks in advance.

    Thanks, dear BalusC.
    I tried this:
    rs.getString("DATA_SCAD1"); // the value comes from .xls files
    String pattern = "yyyy-MM-dd";
    SimpleDateFormat format = new SimpleDateFormat(pattern);
    try {
        Date date = format.parse("DATA_SCAD1"); // bug: parses the literal string, not the value read above
        System.out.println(date);
    } catch (ParseException e) {
        e.printStackTrace();
    }
    System.out.println(format.format(new Date()));
    This output gives me Tue Apr 03 00:00:00 IST 2007, but I want to display the date in yyyy-mm-dd format.
    regards,
    maza
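    What the thread is ultimately after can be done with two formats: parse with dd/MM/yyyy and re-format with yyyy-MM-dd (uppercase MM, since lowercase mm means minutes). A minimal self-contained sketch, assuming the value read via rs.getString("DATA_SCAD1") looks like "12/02/2006"; the column name comes from the post above, everything else is illustrative.

    import java.text.ParseException;
    import java.text.SimpleDateFormat;
    import java.util.Date;

    public class DateConverter {
        // Converts e.g. "12/02/2006" (dd/MM/yyyy) to "2006-02-12" (yyyy-MM-dd)
        public static String toMySqlDate(String ddMmYyyy) throws ParseException {
            Date date = new SimpleDateFormat("dd/MM/yyyy").parse(ddMmYyyy);
            return new SimpleDateFormat("yyyy-MM-dd").format(date);
        }

        public static void main(String[] args) throws ParseException {
            String fromXls = "12/02/2006"; // would come from rs.getString("DATA_SCAD1")
            System.out.println(toMySqlDate(fromXls)); // prints 2006-02-12
        }
    }

    Note that a java.util.Date itself has no format; only the string produced by format() does, which is why printing the Date directly always shows the long default form.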

  • Tp ended with error code 0247 - addtobuffer has problems with data- and/or

    Hello experts,
    If you can give some ideas, it will be greatly appreciated.
    This transport issue started after a power outage in which the SAP system went through a hard shutdown. We then brought the system back up; before that, we did not have this transport issue.
    Our TMS landscape is
    DEV - QA - PRD
    SED - SEQ - SEP
    DEV holds the TMS domain controller.
    FYI: at OS level, when we run the scp command as the root user, it works fine for any TR.
    In STMS, while adding a TR in SEQ (the QA system), we get an error like this:
    Error:
    Transport control program tp ended with error code 0247
         Message no. XT200
    Diagnosis
         An error occurred when executing a tp command.
           Command:        ADDTOBUFFER SEDK906339 SEQ client010 pf=/us
           Return code:    0247
           Error text:     addtobuffer has problems with data- and/or
           Request:        SEDK906339
    System Response
         The function terminates.
    Procedure
         Correct the error and execute the command again if necessary.
    This is tp version 372.04.71 (release 700, unicode enabled)
    Addtobuffer failed for SEDK906339.
      Neither datafile nor cofile exist (cofile may also be corrupted).
    standard output from tp and from tools called by tp:
    tp returncode summary:
    TOOLS: Highest return code of single steps was: 0
    ERRORS: Highest tp internal error was: 0247

    When we do the scp via SM69, e.g.
    SEDADM@DEVSYS:/usr/sap/trans/cofiles/K906339.SED SEQADM@QASYS:/usr/sap/trans/cofiles/.
    it throws the error below:
    Host key verification failed.
    External program terminated with exit code 1
    Thanks
    Praba

  • Data source CFMXBible.mdb could not be found.

    Hello,
    I am following the book CFMXBible and I have it set up, but at the beginning of the tutorial there is a form that is supposed to send its contents to the DB.
    After I press the submit button, I get the error below.
    What is wrong here?
    What should I do?
    Thanks
    The web site you are accessing has experienced an unexpected error. Please contact the website administrator.
    The following information is meant for the website developer for debugging purposes.
    Error Occurred While Processing Request
    Data source CFMXBible.mdb could not be found.
    Resources: · Enable Robust Exception Information to provide greater detail about the source of errors. In the Administrator, click Debugging & Logging > Debugging Settings, and select the Robust Exception Information option. · Check the ColdFusion documentation to verify that you are using the correct syntax. · Search the Knowledge Base to find a solution to your problem.
    Browser: Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1; .NET CLR 1.0.3705; .NET CLR 1.1.4322)
    Remote Address: 127.0.0.1
    Referrer: http://server:8500/cfmxbible/ch02/companyaddform.cfm
    Date/Time: 05-May-06 01:23 PM


  • Problem with setting Source Level in Sun Studio 2

    I've got a problem with setting the Source Level to 1.5 in Sun Studio 2. When I try to set it to 1.5 in Project Properties and click OK, everything seems to go well, but when I open Project Properties again the Source Level is back to 1.4. I need this to work because I recently started to learn Java and I want to use the foreach loop.
    Please help.

    I'm just citing an example using Date().
    In fact, whether I use DateFormat or Calendar, it shows the same result.
    When I set the date to 1 Jan 1950, 0 hours 0 minutes 0 seconds,
    JDK 1.4.2 always returns 1 Jan 1950, 0 hours 10 minutes 0 seconds.
    It works correctly under JDK 1.3.1.
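    A minimal sketch of the kind of check described here; whether the ten-minute shift appears depends on the default time zone and the JDK's historical time-zone tables, so this is illustrative rather than a guaranteed reproduction.

    import java.util.Calendar;
    import java.util.Date;

    public class Date1950Check {
        public static void main(String[] args) {
            Calendar cal = Calendar.getInstance();
            cal.clear(); // zero all fields, including milliseconds
            cal.set(1950, Calendar.JANUARY, 1, 0, 0, 0); // 1 Jan 1950 00:00:00
            Date date = cal.getTime();
            // On an affected JDK / time-zone combination this may print
            // 00:10:00 instead of 00:00:00, due to historical UTC offsets.
            System.out.println(date);
        }
    }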

Maybe you are looking for

  • Calling a method

    Hi to everybody. I have written a method of the form public int getMax(double[] may) { ... return i; }. When I try to call it with this.getMax(may[1250]), the compiler gives the error "Wrong number of arguments in method". Please help. Thank you in advance.
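    The likely cause, reading the truncated post: getMax expects a whole double[] array, while may[1250] passes a single double element. A hedged sketch of the fix; the body of getMax is invented for illustration, since the original is cut off.

    public class MaxDemo {
        // Illustrative body: returns the index of the largest element
        public int getMax(double[] may) {
            int i = 0;
            for (int k = 1; k < may.length; k++) {
                if (may[k] > may[i]) {
                    i = k;
                }
            }
            return i;
        }

        public static void main(String[] args) {
            MaxDemo demo = new MaxDemo();
            double[] may = { 3.5, 9.1, 4.2 };
            // Pass the whole array, not one element such as may[1250]
            System.out.println(demo.getMax(may)); // prints 1
        }
    }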

  • Not sure how to go about this program....

    I have an assignment for my Java class that requires me to create a program to simulate a gunfight between 3 people: Bob, Aaron, and Charlie. Bob's shooting accuracy is 50% (the probability of making a kill is 1 out of 2), Aaron's accuracy is 30%, an

  • Display only uses middle third of screen, how to go full width?

    Using Win7-64, IE8 displays correctly on a large CRT. With FF4, most web sites use only about 50% of the display width, centered OK, and some text is overlaid, but the top 15% (tools, bookmarks, etc.) is full width and OK

  • Macbook Pro slow wake up time, HDD + SSD

    A couple of months ago I upgraded my 2012 MacBook Pro (non-Retina) with a 500GB Samsung 840 EVO SSD. Boot times were blazing fast, as was wake-up from sleep. After I installed the original HDD in the super drive bay as second storage, the wak

  • BI - SUS ( Self Service Procurement)

    Hello everyone, I am pulling procurement data into BI from SRM 5.0. Now the functional team is asking me to pull SUS PO, SUS PO Response, SUS ASN and SUS invoice-related information from SUS. I checked the business content informatio