Change Data Services Log Folder Path

Hello,
How can I change the default location of the LOG folder in Business Objects Data Services?
Thanks

Do you want to change the default log folder for the job server, or for all DS applications?
I don't think you will be able to change the default log folder of all DS applications.
For the job server, the default log folder setting AL_JobServerLogDirectory1 alone is not enough; you need to set the following two values in DSConfig.txt, and both must be set: AL_JobServerLogReDir1 to the complete path without the subfolder name, and AL_JobServerLogDirectory1 to just the name of the subfolder.
For example, if you want to set the log folder to C:\JobServerLogs\log,
then set the values as below:
AL_JobServerLogReDir1=C:\JobServerLogs
AL_JobServerLogDirectory1=log
If you have multiple job servers, replace the 1 suffix with the number of that job server.
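
For reference, these entries live in DSConfig.txt on the Job Server machine; with two job servers the relevant block would look something like this (a sketch only; the exact section name and neighbouring keys vary by version, so treat it as illustrative):

[AL_JobServer]
AL_JobServerLogReDir1=C:\JobServerLogs
AL_JobServerLogDirectory1=log
AL_JobServerLogReDir2=D:\JobServer2Logs
AL_JobServerLogDirectory2=log

Restart the job server after editing DSConfig.txt so the new path takes effect.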

Similar Messages


  • Archive - change date format for tftp path

    Hi NG,
    HW: WS-C2960S-48FPS-L
    IOS: c2960s-universalk9-mz.122-58.SE1.bin
    config:
    archive
    log config
    logging enable
    logging size 10
    notify syslog contenttype plaintext
    hidekeys
    path tftp://IP.IP.IP.IP/path/$h.cfg
    Now 'show archive' shows me the following:
    sh archive:
    The next archive file will be named tftp://IP.IP.IP.IP/path/hostname.cfg-<timestamp>-0
    a) What if I do not wish to have the timestamp appended?
    If I specify the path like the following, I can at least choose where the timestamp is inserted:
    path tftp://IP.IP.IP.IP/path/$h__$t.cfg
    Now the file received on the TFTP server looks like the following:
    hostname__Jul--1-18-48-20.cfg-0
    I don't like the format; is there any way to change or set it manually?
    If not, I come back to question a): can I somehow turn it off?
    Thank you

    The overall s***y format cannot be changed, but 'service timestamps log' can change some of what gets logged.
    I would find the people responsible and fire them. While I commend the adding of a timestamp, the absolute lack of control over its format and usage is absolutely horrible. Who in their right mind thinks that's a good format? Add to it, there's no way to turn off the now completely unnecessary revision tag (the "-#" at the end).
    If I had a covered (read: non-EOL'd) device, you bet there'd be bugs logged against it.
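
    For completeness, the 'service timestamps' knob the reply refers to is standard IOS configuration; it changes how syslog entries are stamped, not the archive file name (a sketch, shown only to illustrate the control that does exist):

    configure terminal
     service timestamps log datetime msec localtime show-timezone
    end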

  • How to change/apply new home folder path to 350+ users?

    quick intro:
    ...had a screwy server, planned a rebuild.
    added all active users to a specific group.
    Exported said group, exported ALL users, exported anything I could think of!
    Have vanilla built 10.4.10 server, few teething probs with a DNS glitch, think I'm over it...
    not 100 percent sure, but 4 users I've tested appear to be working.
    Imported the specific group; it didn't show any members.
    Had to import ALL users before they populated; unfortunately ALL users appear to have active accounts, and I only want the specific group active
    (was I expecting too much?)
    In a bid to tidy up home dirs, correct privileges etc.:
    is there a simple way to apply the same privs to hundreds of folders rather than selecting each one?
    I basically want <user> as the owner, <admin> as the group, everyone else none.
    Also need to change/apply a new home dir path to 350-plus users.
    What is the easiest way to do this?
    TIA
    a very tired and frustrated system admin :-\

    eeeff wrote:
    quick intro:
    ...had a screwy server, planned a rebuild.
    added all active users to a specific group.
    Exported said group, exported ALL users, exported anything I could think of!
    Have vanilla built 10.4.10 server, few teething probs with a DNS glitch, think I'm over it...
    not 100 percent sure, but 4 users I've tested appear to be working.
    Imported the specific group; it didn't show any members.
    Had to import ALL users before they populated; unfortunately ALL users appear to have active accounts, and I only want the specific group active
    (was I expecting too much?)
    Use the Search feature in WGM (upper right icon). If you want to apply a setting to, say, all users in the group "Teachers" and the GID is 100, then you would search on "Group ID IS 100". Check the box to perform a batch edit. Now make any changes you want and save. WGM will prompt you with a list of users the change will apply to. Also, if you don't check the box you will see a list of all users in that group; you can use this to delete all the members of another group OR export only the members in a group (highlight them all after a search).
    In a bid to tidy up home dirs, correct privileges etc.
    is there a simple way to apply the same privs to hundreds of folders rather than selecting each one?
    I basically want <user> as the owner, <admin> as the group, everyone else none.
    The following terminal command will apply privileges to every folder inside the current folder. Note that the folder name and the short name for the user must match exactly. cd to the directory with the home folders first. (The original one-liner put sudo in front of the for loop, which does not work because sudo cannot run shell keywords; moving sudo inside the loop fixes that.)
    for i in *; do sudo chown -R "$i" "$i"; done
    (cut and paste to reduce the chance of a devastating typo)
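
    The one-liner above only sets the owner. A variant that also sets the group and strips access for everyone else, per the owner/admin/none scheme described above, might look like this (a sketch; it assumes every folder name matches a user's short name and that "admin" is the intended group, so test it on a copy first):

    cd /path/to/home/folders
    for i in *; do
      sudo chown -R "$i":admin "$i"   # owner = the user, group = admin
      sudo chmod -R 770 "$i"          # full access for owner and group, none for others
    done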
    Also need to change/apply a new home dir path to 350plus users
    What is the easiest way to do this?
    TIA
    a very tired and frustrated system admin :-\

  • Data Services 4.0 Designer. Job Execution but empty log file no matter what

    Hi all,
    I am running DS 4.0. When I execute my batch job via the Designer, the log window pops up but is blank, i.e. I cannot see any trace messages.
    It doesn't matter if I select "Print all trace messages" in the execution properties.
    The Job Server is running on a separate server. The only thing I have locally is just my Designer.
    If I log into the Data Services Management Console and select the job server, I can see trace and error logs from the job. So I guess what I need is for this stuff to show up in my Designer?
    Did I miss a step somewhere?
    I can't find anything in the docs about this.
    thanks
    Edited by: Andrew Wangsanata on May 10, 2011 11:35 AM
    Added additional detail

    Awesome, thanks Manoj.
    I found the log file. In it, the relevant lines for the last job I ran are:
    (14.0) 05-11-11 16:52:27 (2272:2472) JobServer:  Starting job with command line -PLocaleUTF8 -Utip_coo_ds_admin
                                                    -P+04000000001A030100100000328DE1B2EE700DEF1C33B1277BEAF1FCECF6A9E9B1DA41488E99DA88A384001AA3A9A82E94D2D9BCD2E48FE2068E59414B12E
                                                    48A70A91BCB  -ek********  -G"70dd304a_4918_4d50_bf06_f372fdbd9bb3" -r1000 -T1073745950  -ncollect_cache_stats
                                                    -nCollectCacheSize  -ClusterLevelJOB  -Cmxxx -CaDesigner -Cjxxx -Cp3500 -CtBatch  -LocaleGV
                                                    -BOESxxx.xxx.xxx.xxx -BOEAsecLDAP -BOEUi804716
                                                    -BOEP+04000000001A0301001000003F488EB2F5A1CAB2F098F72D7ED1B05E6B7C81A482A469790953383DD1CDA2C151790E451EF8DBC5241633C1CE01864D93
                                                    72DDA4D16B46E4C6AD -Sxxx.xxx.xxx -NMicrosoft_SQL_Server -Qlocal_repo  coo ds local
                                                    repo_azdzgq4dnuxbm4xeriey1_e" -l"C:\Program Files (x86)\SAP BusinessObjects\Data Services/log/js01/tip coo ds local
                                                    repo_azdzgq4dnuxbm4xeriey1_e/trace_05_11_2011_16_52_27_9__70dd304a_4918_4d50_bf06_f372fdbd9bb3.txt" -z"C:\Program Files
                                                    (x86)\SAP BusinessObjects\Data Services/log/js01/tip coo ds local
                                                    repo_azdzgq4dnuxbm4xeriey1_e/error_05_11_2011_16_52_27_9__70dd304a_4918_4d50_bf06_f372fdbd9bb3.txt" -w"C:\Program Files
                                                    (x86)\SAP BusinessObjects\Data Services/log/js01/tip coo ds local
                                                    repo_azdzgq4dnuxbm4xeriey1_e/monitor_05_11_2011_16_52_27_9__70dd304a_4918_4d50_bf06_f372fdbd9bb3.txt" -Dt05_11_2011_16_52_27_9
                                                    (BODI-850052)
    (14.0) 05-11-11 16:52:27 (2272:2472) JobServer:  StartJob : Job '05_11_2011_16_52_27_9__70dd304a_4918_4d50_bf06_f372fdbd9bb3' with pid '148' is kicked off
                                                    (BODI-850048)
    (14.0) 05-11-11 16:52:28 (2272:2072) JobServer:  Sending notification to <inet:10.165.218.xxx:56511> with message type <4> (BODI-850170)
    (14.0) 05-11-11 16:52:28 (2272:2472) JobServer:  AddChangeInterest: log change interests for <05_11_2011_16_52_27_9__70dd304a_4918_4d50_bf06_f372fdbd9bb3> from client
                                                    <inet:10.165.218.xxx:56511>. (BODI-850003)
    (14.0) 05-11-11 17:02:32 (2272:2472) JobServer:  RemoveChangeInterest: log change interests for <05_11_2011_16_52_27_9__70dd304a_4918_4d50_bf06_f372fdbd9bb3> from client
                                                    <inet:10.165.218.xxx:56511>. (BODI-850003)
    (14.0) 05-11-11 19:57:45 (2272:2468) JobServer:  GetRunningJobs() success. (BODI-850058)
    (14.0) 05-11-11 19:57:45 (2272:2468) JobServer:  PutLastJobs Success.  (BODI-850001)
    (14.0) 05-11-11 19:57:45 (2272:2072) JobServer:  Sending notification to <inet:10.165.218.xxx:56511> with message type <5> (BODI-850170)
    (14.0) 05-11-11 19:57:45 (2272:2472) JobServer:  GetHistoricalLogStatus()  Success. 05_11_2011_16_52_27_9__70dd304a_4918_4d50_bf06_f372fdbd9bb3 (BODI-850001)
    (14.0) 05-11-11 19:57:45 (2272:2472) JobServer:  GetHistoricalLogStatus()  Success. 05_11_2011_16_52_27_9__70dd304a_4918_4d50_bf06_f372fdbd9bb3 (BODI-850001)
    It does not look like I have any errors with respect to connectivity (or any errors at all...).
    Please advise on what, if anything, you notice from the log file and/or next steps I can take.
    Thanks.

  • Data Services Designer 14 - Large Log files

    Hello,
    we're running several jobs with the Data Services Designer 14, all works fine.
    But today a problem occurred:
    after finishing a big job, the Designer on a client machine produced a very large log file (8 GB) in the Data Services Designer folder.
    Is it possible to delete these log files automatically, or to restrict the maximum size of the log files the Designer creates?
    What's the best way?
    Thanks!

    You can set to automatically delete the log files based on number of days.
    I have done this in XI 3.2, but as per the document, this is how it can be done in DS 14.0.
    In DS 14.0, this is handled in CMC.
    1. Log into the Central Management Console (CMC) as a user with administrative rights to the Data Services application.
    2. Go to the "Applications" management area of the CMC. The "Applications" dialog box appears.
    3. Right-click the Data Services application and select Settings. The "Settings" dialog box appears.
    4. In the Job Server Log Retention Period box, enter the number of days that you want to retain the following:
    • Historical batch job error, trace, and monitor logs
    • Current service provider trace and error logs
    • Current and historical Access Server logs
    The software deletes all log files beyond this period. For example:
    • If you enter 1, then the software displays the logs for today only. After 12:00 AM, these logs clear and the software begins saving logs for the next day.
    • If you enter 0, then no logs are maintained.
    • If you enter -1, then no logs are deleted.
    Regards,
    Suneer.

  • Data Services 4.1 upgrade failed

    Hi Everyone,
    We came across a really strange situation when we tried to upgrade our DS from 4.0 SP3 to 4.1 SP2.
    Before the upgrade, our DSConfig.txt file was in the bin folder where DS is installed (E:\SAP BusinessObject\Data Services\bin). After the upgrade, we discovered a second DSConfig.txt file was created and saved to the C:\Program Files\SAP BusinessObjects\Data Services\conf folder. How is this possible when during the upgrade we specifically pointed to the E:\SAP BusinessObject location? More importantly, which DSConfig.txt file should we use? Right now we can't connect to our repositories through the CMC.
    TIA

    Also, do not worry about the change in the directory. The release notes mention that the linked path was changed to a common directory path.
    Configuration
    Old location:
    <LINK_DIR>/bin/DSConfig.txt
    New location:
    <DS_COMMON_DIR>/conf/DSConfig.txt
    It is odd that SAP had to change the paths, but I think your upgrade went well.
    Please check the release notes document for other changes.
    http://help.sap.com/businessobject/product_guides/sboDS41/en/sbo411_ds_whats_new_en.pdf
    Good Luck

  • .bat file result not giving correctly in data services

    Hi,
    I am using a .bat file to count the number of files in a folder, and I want to capture that count.
    I have written small .bat file as shown below
    @echo off
    setlocal enableextensions
    rem %1 = folder to count; note that CD alone does not switch drives
    CD %1
    set cnt=0
    rem count every file in the current folder
    for %%A in (*) do set /a cnt+=1
    echo File count = %cnt%
    endlocal
    I have used this batch file in a script as shown below.
    $COUNT =exec('D:\\test.bat', 'D:\\docs',8);
    print($COUNT);
    In the above, $COUNT is varchar(255).
    I actually have only 11 files in the docs folder, but it is reporting 442 files.
    If I run the same batch file at the Windows cmd prompt, I get the correct result.
    Please let me know what the issue might be.
    I am unable to figure out the reason for this weird result when I am using DS.
    Please help me in this issue.
    Thanks & Regards,
    Ramana.

    Hi,
    While executing the batch file, DS was using the Data Services installation directory as the working directory, as shown below.
    14504    6420    PRINTFN    3/30/2015 9:40:10 AM    0: C:\Program Files (x86)\SAP BusinessObjects\Data Services\bin  File count = 442
    I have changed the batch file to take two parameters: one for changing the drive and another for changing the folder.
    Now my batch file looks like this.
    @echo off
    setlocal enableextensions
    rem %1 = drive letter, %2 = folder path
    %1:
    CD %2
    echo %CD%
    set cnt=0
    for %%A in (*) do set /a cnt+=1
    echo File count = %cnt%
    endlocal
    Then my script is like this .
    $COUNT = exec('D:\\test.bat', 'D D:\\docs', 8);
    print($COUNT);
    Now it is giving the expected result as shown below.
    14700    13912    PRINTFN    3/30/2015 9:52:18 AM    0: D:\Docs  File count = 11
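
    A single-parameter variant is also possible, since cmd's CD accepts a /D switch that changes drive and directory in one step (a sketch, not tested against the poster's setup):

    @echo off
    setlocal enableextensions
    rem /D switches the drive as well as the directory
    CD /D %1
    set cnt=0
    for %%A in (*) do set /a cnt+=1
    echo File count = %cnt%
    endlocal

    It would then be called as $COUNT = exec('D:\\test.bat', 'D:\\docs', 8);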
    Thanks & Regards,
    Ramana.

  • Jobs gets hanged when a call is made to PLSQL function in Data Services XI

    Hi,
    I am facing the issue below after migrating from BODI 11.7 to BODS XI 3.1.
    The job does not proceed past the statements below.
    print('before call');
    $is_job_enable=DS_TEST.TEST.MY_PKG.IS_JOB_ENABLED(job_name());
    print($is_job_enable);
    The MY_PKG.IS_JOB_ENABLED PL/SQL function returns a Number.
    $is_job_enable is a global variable declared as decimal(10, 0).
    This job works fine in Data Integrator 11.7.3 and hangs in Data Services XI 3.1.
    I tried changing the global variable $is_job_enable to int and creating new data sources, but that doesn't solve the problem. Can anyone tell me what the issue is?
    Thanks & Regards
    Maran MK
    The trace file says
    5260     3284     JOB     5/5/2009 4:43:17 AM     Job <TEST_JOB> is started.
    5260     3284     PRINTFN     5/5/2009 4:43:17 AM     before call
    5260     3284     SP     5/5/2009 4:43:18 AM     Stored procedure call <MY_PKG.IS_JOB_ENABLED> is started.
    5260     3284     SP     5/5/2009 4:43:18 AM     SQL query submitted for stored procedure call <MY_PKG.IS_JOB_ENABLED> is: <BEGIN :AL_SP_RETURN :=
    5260     3284     SP     5/5/2009 4:43:18 AM     "TEST"."MY_PKG"."IS_JOB_ENABLED"("P_JOB_NAME" => :P_JOB_NAME); END;
    5260     3284     SP     5/5/2009 4:43:18 AM     >.
    5260     3284     SP     5/5/2009 4:43:18 AM     Stored procedure call <E> input parameter <P> has value of <TEST_JOB>.
    5260     3284     SP     5/5/2009 4:43:18 AM     Stored procedure call <E> return value is <1.0000000>.
    5260     3284     SP     5/5/2009 4:43:18 AM     Stored procedure call <MY_PKG.IS_JOB_ENABLED> is done.
    The below error occurs only in Windows and not in Linux environment.
    5260     3284     SYS-170101     5/5/2009 4:43:21 AM     |Session TEST_JOB
    5260     3284     SYS-170101     5/5/2009 4:43:21 AM     System Exception <ACCESS_VIOLATION> occurred. Process dump is written to <E:\Program Files\Business Objects\Data
    5260     3284     SYS-170101     5/5/2009 4:43:21 AM     Services\log\BODI_MINI20090505044318_5260.DMP> and <E:\Program Files\Business Objects\Data
    5260     3284     SYS-170101     5/5/2009 4:43:21 AM     Services\log\BODI_FULL20090505044318_5260.DMP>
    5260     3284     SYS-170101     5/5/2009 4:43:21 AM     Process dump is written to <E:\Program Files\Business Objects\Data Services\log\BODI_MINI20090505044318_5260.DMP> and
    5260     3284     SYS-170101     5/5/2009 4:43:21 AM     <E:\Program Files\Business Objects\Data Services\log\BODI_FULL20090505044318_5260.DMP>
    5260     3284     SYS-170101     5/5/2009 4:43:21 AM     Call stack:
    5260     3284     SYS-170101     5/5/2009 4:43:21 AM     001B:00CA9EAB, ActaDecimalImpl<RWFixedDecimal<RWMultiPrecisionInt<3> >,RWMultiPrecisionInt<3>,ActaDecimal28,char [29]>::operator=()+0315 byte(s), x:\src\rww\actadecimalimpl.cpp, line 1314+0004 byte(s)
    5260     3284     SYS-170101     5/5/2009 4:43:21 AM     001B:00D8A267, Convert()+0999 byte(s), x:\src\eval\calc.cpp, line 0303
    5260     3284     SYS-170101     5/5/2009 4:43:21 AM     001B:00DBF9E0, XVal_cast::compute()+0272 byte(s), x:\src\core\compute.cpp, line 1664
    5260     3284     SYS-170101     5/5/2009 4:43:21 AM     001B:00DBC239, XStep_assn::execute()+0057 byte(s), x:\src\core\step.cpp, line 0069
    5260     3284     SYS-170101     5/5/2009 4:43:21 AM     001B:00DBB30D, XStep_sblock::execute()+0029 byte(s), x:\src\core\step.cpp, line 0707
    5260     3284     SYS-170101     5/5/2009 4:43:21 AM     001B:00DBB30D, XStep_sblock::execute()+0029 byte(s), x:\src\core\step.cpp, line 0707
    5260     3284     SYS-170101     5/5/2009 4:43:21 AM     001B:00DBE0BC, XPlan_spec::execute()+0348 byte(s), x:\src\core\plan.cpp, line 0082
    5260     3284     SYS-170101     5/5/2009 4:43:21 AM     001B:00DC5EA0, XPlan_desc::execute()+0336 byte(s), x:\src\core\xplan.cpp, line 0153
    5260     3284     SYS-170101     5/5/2009 4:43:21 AM     001B:00DBD68E, XPlan_spec::compute()+0206 byte(s), x:\src\core\plan.cpp, line 0145+0011 byte(s)
    5260     3284     SYS-170101     5/5/2009 4:43:21 AM     001B:00DBD891, XPlan_spec::compute()+0225 byte(s), x:\src\core\plan.cpp, line 0244
    5260     3284     SYS-170101     5/5/2009 4:43:21 AM     001B:0074533A, AE_Main_Process_Options()+31498 byte(s), x:\src\xterniface\actamainexp.cpp, line 3485
    5260     3284     SYS-170101     5/5/2009 4:43:21 AM     001B:00747EDA, AE_Main()+1498 byte(s), x:\src\xterniface\actamainexp.cpp, line 0768+0030 byte(s)
    5260     3284     SYS-170101     5/5/2009 4:43:21 AM     001B:004029F9
    5260     3284     SYS-170101     5/5/2009 4:43:21 AM     Registers:
    5260     3284     SYS-170101     5/5/2009 4:43:21 AM     EAX=0000000E  EBX=03E392E0  ECX=04B455A0  EDX=012346D8  ESI=02B75D88
    5260     3284     SYS-170101     5/5/2009 4:43:21 AM     EDI=04B455A0  EBP=00212738  ESP=002124BC  EIP=00CA9EAB  FLG=00210206
    5260     3284     SYS-170101     5/5/2009 4:43:21 AM     CS=001B   DS=0023  SS=0023  ES=0023   FS=003B  GS=0000
    5260     3284     SYS-170101     5/5/2009 4:43:21 AM     Exception code: C0000005 ACCESS_VIOLATION
    5260     3284     SYS-170101     5/5/2009 4:43:21 AM     Fault address:  00CA9EAB 01:00585EAB E:\Program Files\Business Objects\Data Services\bin\acta.dll

    Hi Manoj & Tiji,
    Thanks for your comments. Please find the outcome below.
    print($is_job_enable); is not executed if the PL/SQL function is called.
    I changed $is_job_enable to VARCHAR; still the same issue.
    I created a new project and executed the same in a new job; still the same issue (all objects are new except the datastore).
    The dump happens only when the PL/SQL function is called. I commented out the function call; the execution proceeds further but hangs in another PL/SQL function call (different from the first one).
    Is this a bug in 12.1?
    Can you tell me of any hot fix available? If possible, please give me the SAP Note number.
    Is there any other way to execute PL/SQL functions/procedures in 12.1?
    Thanks
    Maran MK
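
    On the last question, a workaround sometimes suggested is to invoke the function through the script sql() function rather than as an imported stored-procedure call, so the engine only has to handle a plain scalar result. A sketch, assuming the datastore is named DS_TEST and the function may be called from a SELECT (both are assumptions, so adapt to your repository):

    # fetch the flag via a plain SQL SELECT instead of the imported PL/SQL call
    $job = job_name();
    $is_job_enable = cast(sql('DS_TEST', 'SELECT TEST.MY_PKG.IS_JOB_ENABLED({$job}) FROM DUAL'), 'decimal(10, 0)');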

  • Directory structure for the new Data Services Project

    1) I do as prescribed in the manual "Building and deploying
    Flex 2 Applications", page 325
    "To create a web application make a copy of the /flex
    directory and its contents. Rename the copy and store it in the
    same location under /servers/default directory."
    ("flex" is an empty Flex Data Services application that
    serves as a template for creating your custom application)
    2) I create a corresponding project from Flex Builder 2 :
    Project type: Flex Data Services
    Root folder: C:\fds2\jrun4\servers\default\MyDS
    Root URL:
    http://localhost:8700/default/MyDS
    Project name: MyDS
    Project contents: C:\fds2\jrun4\servers\default\MyDS
    3) I build the project
    Immediately after "build project" the directory structure at
    C:\fds2\jrun4\servers\default\MyDS becomes the following:
    .settings
    bin
        META-INF
        WEB-INF
            classes
            flex
            jsp
            lib
            sessions
    html-template
    META-INF
    WEB-INF
        classes
        flex
        jsp
        lib
        sessions
    Notice that the bin directory now contains another pair of META-INF and WEB-INF in addition to those already existing in the template project "flex".
    Can anybody comment on this directory structure?
    Which META-INF and WEB-INF are supposed to be used for
    configuration?
    What is the purpose of having two pairs of META-INF and
    WEB-INF in the same web app?

    Hello -
    First, those folders are necessary in deployment. You need only the contents of the bin folder for deployment, not the sources. Since you're compiling the application locally in FB2, it places all of the supporting and necessary files into one location, namely the "bin" folder. You'd deploy the "bin" folder's contents to the FDS server, perhaps another FDS server that is not your "development" server, such as a production server. The data and configuration information that your app needs for FDS services are stored in the WEB-INF and META-INF folders, so these need to travel with the final product. On the production server you'd just copy the "bin" folder and its contents to the /servers/default folder, where you could then rename your bin folder to "MyDS".
    HTH, Bill

  • Data services job failes while insert data into SQL server from Linux

    The SAP Data Services (Data Quality) server runs on a Linux server and on Windows. A Data Services job that uses the ODBC driver to connect to SQL Server fails after selecting a few thousand records, with the reason below given in the Data Services log on the Linux server. We can run the same job from the Windows server; the only difference there is that it uses the SQL Server drivers provided by Microsoft. So of the possible errors listed below, #1 and #4 may not be the reason for the failure. The DBA checked the other errors and confirmed that the transaction log size is unlimited and the system has space.
    Why does the same job run from the Windows server and fail from Linux? Is it because the ODBC drivers on Windows and Linux work in different ways, or is there a conflict between the Data Services job and the ODBC driver?
    ===== Error Log ===================
    8/25/2009 11:51:51 AM Execution of <Regular Load Operations> for target <DQ_PARSE_INFO> failed. Possible causes: (1) Error in the SQL syntax;
    (2)6902 3954215840 RUN-051005 8/25/2009 11:51:51 AM Database connection is broken; (3) Database related errors such as transaction log is full, etc.; (4) The user defined in the
    6902 3954215840 RUN-051005 8/25/2009 11:51:51 AM datastore has insufficient privileges to execute the SQL. If the error is for preload or postload operation, or if it is for
    ===== Error Log ===================

    This is another method:
    http://www.mssqltips.com/sqlservertip/2484/import-data-from-microsoft-access-to-sql-server/

  • Business Objects Data Services Incremental Load Help

    Hi, this is my first time creating an incremental load for a batch job. My batch job consists of a try, an initialization script, a data flow, and a catch. When I validate my initialization script I get an error; could you review the script below and identify the error? My data flow consists of the datastore table I imported, with a query, then a table comparison, then key generation, then the table I am updating.
    # Set Todays Date
    $SYSDATE = cast ( sysdate (), 'date' );
    print ('Today\' date:' || cast($SYSDATE, 'varchar(10)'));
    # SET CDC DATE
    $CDC_DATE = nvl (cast(sql('Target', 'SELECT MAX(BATCH_END_DATE) FROM BATCH_CONTROL WHERE BATCH_NAME = {$BATCH_NAME}
    AND BATCH_STATUS = \'SUCESS\' '), 'date'), cast(to_date('1900-01-01', 'YYYY-MM-DD'), 'date'));
    #Mark an entry in Batch_Control
    # Batch_Name    BATCH_STATUS   BATCH_START_DATE   BATCH_END_DATE Load_DATE
    sql('Target', 'INSERT INTO BATCH_CONTROL VALUES ( {BATCH_NAME}, \'STARTED', {to_char ($CDC_DATE, \'YYYY-MM-DD\')}, NULL, {to_char ($SYSDATE, \'YYYY-MM-DD\')};
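
    For reference, a repaired version of the script might look like this (a sketch: it keeps the poster's table layout and spelling of the 'SUCESS' status value, assumes $BATCH_NAME is a declared global variable, and fixes the unterminated string literals and missing closing parentheses in the original):

    # Set today's date
    $SYSDATE = cast(sysdate(), 'date');
    print('Today\'s date: ' || cast($SYSDATE, 'varchar(10)'));
    # Set CDC date
    $CDC_DATE = nvl(cast(sql('Target', 'SELECT MAX(BATCH_END_DATE) FROM BATCH_CONTROL WHERE BATCH_NAME = {$BATCH_NAME} AND BATCH_STATUS = \'SUCESS\''), 'date'), cast(to_date('1900-01-01', 'YYYY-MM-DD'), 'date'));
    # Mark an entry in BATCH_CONTROL
    sql('Target', 'INSERT INTO BATCH_CONTROL VALUES ({$BATCH_NAME}, \'STARTED\', {to_char($CDC_DATE, \'YYYY-MM-DD\')}, NULL, {to_char($SYSDATE, \'YYYY-MM-DD\')})');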

    So I resolved the first error; now I am receiving this long error. Any ideas?
    13388    15908    SYS-170101    5/22/2014 10:39:54 AM    |Session Table_Incramental_Load
    13388    15908    SYS-170101    5/22/2014 10:39:54 AM    System Exception <ACCESS_VIOLATION> occurred. Process dump is written to <C:\Program Files (x86)\Business Objects\BusinessObjects Data Services\log\BODI_MINI20140522103951_13388.DMP> and <C:\Program Files (x86)\Business Objects\BusinessObjects Data Services\log\BODI_FULL20140522103951_13388.DMP>
    13388    15908    SYS-170101    5/22/2014 10:39:54 AM    Process dump is written to <C:\Program Files (x86)\Business Objects\BusinessObjects Data Services\log\BODI_MINI20140522103951_13388.DMP> and <C:\Program Files (x86)\Business Objects\BusinessObjects Data Services\log\BODI_FULL20140522103951_13388.DMP>
    13388    15908    SYS-170101    5/22/2014 10:39:54 AM    Call stack:
    13388    15908    SYS-170101    5/22/2014 10:39:54 AM    0023:00D305BA, TrCallStatement::process_dbdiff_xform_new()+6666 byte(s), x:\src\parser\process_predef_xform.cpp, line 7281
    13388    15908    SYS-170101    5/22/2014 10:39:54 AM    0023:00D3128E, TrCallStatement::process_diff_xform()+1422 byte(s), x:\src\parser\process_predef_xform.cpp, line 0432
    13388    15908    SYS-170101    5/22/2014 10:39:54 AM    0023:00D356EE, TrCallStatement::process_predef_xform_options()+0286 byte(s), x:\src\parser\process_predef_xform.cpp, line 0067+0017 byte(s)
    13388    15908    SYS-170101    5/22/2014 10:39:54 AM    0023:00C313A5, TrCallStatement::processStatement()+0789 byte(s), x:\src\parser\dataflowstm.cpp, line 3307
    13388    15908    SYS-170101    5/22/2014 10:39:54 AM    0023:00C310FC, TrCallStatement::processStatement()+0108 byte(s), x:\src\parser\dataflowstm.cpp, line 3201+0012 byte(s)
    13388    15908    SYS-170101    5/22/2014 10:39:54 AM    0023:00C0FB55, DataFlowDef::processStatements()+0101 byte(s), x:\src\parser\dataflow.cpp, line 2331+0014 byte(s)
    13388    15908    SYS-170101    5/22/2014 10:39:54 AM    0023:00C110D5, DataFlowDef::buildGraph()+1621 byte(s), x:\src\parser\dataflow.cpp, line 1723
    13388    15908    SYS-170101    5/22/2014 10:39:54 AM    0023:00C12D99, DataFlowDef::processObjectDef()+2793 byte(s), x:\src\parser\dataflow.cpp, line 1290
    13388    15908    SYS-170101    5/22/2014 10:39:54 AM    0023:00CB9DC5, CallStep::processStep()+2037 byte(s), x:\src\parser\planstep.cpp, line 1050
    13388    15908    SYS-170101    5/22/2014 10:39:54 AM    0023:FFFFFFFF, NsiAllocateAndGetPersistentDataWithMaskTable()+-1997676757 byte(s)
    13388    15908    SYS-170101    5/22/2014 10:39:54 AM    0023:00CB443C, CompoundStep::processStep()+0044 byte(s), x:\src\parser\planstep.cpp, line 4460+0044 byte(s)
    13388    15908    SYS-170101    5/22/2014 10:39:54 AM    0023:00CB406F, TryStep::processStep()+0335 byte(s), x:\src\parser\planstep.cpp, line 3634
    13388    15908    SYS-170101    5/22/2014 10:39:54 AM    0023:00CB33A6, Step::processStepBlock()+0134 byte(s), x:\src\parser\planstep.cpp, line 0377+0018 byte(s)
    13388    15908    SYS-170101    5/22/2014 10:39:54 AM    0023:00CB443C, CompoundStep::processStep()+0044 byte(s), x:\src\parser\planstep.cpp, line 4460+0044 byte(s)
    13388    15908    SYS-170101    5/22/2014 10:39:54 AM    0023:00C8A78E, PlanDef::processObjectDef()+2718 byte(s), x:\src\parser\plandef.cpp, line 0689
    13388    15908    SYS-170101    5/22/2014 10:39:54 AM    0023:00ABB806, AE_Main_Process_Options()+32534 byte(s), x:\src\xterniface\actamainexp.cpp, line 3622
    13388    15908    SYS-170101    5/22/2014 10:39:54 AM    0023:00ABFAB1, AE_Main()+1505 byte(s), x:\src\xterniface\actamainexp.cpp, line 0830+0030 byte(s)
    13388    15908    SYS-170101    5/22/2014 10:39:54 AM    0023:00402AE9
    13388    15908    SYS-170101    5/22/2014 10:39:54 AM    Registers:
    13388    15908    SYS-170101    5/22/2014 10:39:54 AM    EAX=056E85F0  EBX=00000000  ECX=00000010  EDX=02250048  ESI=056E85F0
    13388    15908    SYS-170101    5/22/2014 10:39:54 AM    EDI=056E85A8  EBP=04A7C590  ESP=002700F0  EIP=00D305BA  FLG=00010206
    13388    15908    SYS-170101    5/22/2014 10:39:54 AM    CS=0023   DS=002B  SS=002B  ES=002B   FS=0053  GS=002B
    13388    15908    SYS-170101    5/22/2014 10:39:54 AM    Exception code: C0000005 ACCESS_VIOLATION
    13388    15908    SYS-170101    5/22/2014 10:39:54 AM    Fault address:  00D305BA 01:0029A5BA C:\Program Files (x86)\Business Objects\BusinessObjects Data Services\bin\acta.dll

  • Debug mode crashes Data Services ?

    Hi,
    I have installed BO Data Services XI 3.2 (ver. 12.2.0.1) on Windows Server 2003.
    Every time I run a job in debug mode, Data Services crashes with the error:
    SYS-170101  21.10.2009 10:59:03     |Session job_test
    SYS-170101  21.10.2009 10:59:03     System Exception <ACCESS_VIOLATION> occurred. Process dump is written to <C:\Program Files\Business Objects\BusinessObjects Data Services\log\BODI_MINI20091021105901_5720.DMP> and <C:\Program Files\Business Objects\BusinessObjects Data Services\log\BODI_FULL20091021105901_5720.DMP>
    SYS-170101  21.10.2009 10:59:03     Process dump is written to <C:\Program Files\Business Objects\BusinessObjects Data Services\log\BODI_MINI20091021105901_5720.DMP> and <C:\Program Files\Business Objects\BusinessObjects Data Services\log\BODI_FULL20091021105901_5720.DMP>
    SYS-170101  21.10.2009 10:59:03     Call stack:
    SYS-170101  21.10.2009 10:59:03     001B:009E96D6, DataFlowDef::preOptProcessing()+1366 byte(s), x:\src\compiler\dataflow.cpp, line 1242+0005 byte(s)
    SYS-170101  21.10.2009 10:59:03     001B:009E8C56, DataFlowDef::optimize()+0102 byte(s), x:\src\compiler\dataflow.cpp, line 030+
    SYS-170101  21.10.2009 10:59:03     001B:009EF3EA, CompoundStep::optimize()+0026 byte(s), x:\src\compiler\step.cpp, line 0459+0025 byte(s)
    SYS-170101  21.10.2009 10:59:03     001B:007BAD07, AE_Main_Process_Options()+32023 byte(s), x:\src\xterniface\actamainexp.cpp, line 3542+
    SYS-170101  21.10.2009 10:59:03     001B:007BEE01, AE_Main()+1505 byte(s), x:\src\xterniface\actamainexp.cpp, line 0811+0030 byte(s)
    SYS-170101  21.10.2009 10:59:03     001B:00402AE9
    SYS-170101  21.10.2009 10:59:03     Registers:
    SYS-170101  21.10.2009 10:59:03     EAX=00000000  EBX=02C4CFB0  ECX=02BFE728  EDX=02C4F890  ESI=033C40B8
    SYS-170101  21.10.2009 10:59:03     EDI=00000008  EBP=02CB3668  ESP=00212674  EIP=009E96D6  FLG=00210202
    SYS-170101  21.10.2009 10:59:03     CS=001B   DS=0023  SS=0023  ES=0023   FS=003B  GS=0000
    SYS-170101  21.10.2009 10:59:03     Exception code: C0000005 ACCESS_VIOLATION
    SYS-170101  21.10.2009 10:59:03     Fault address:  009E96D6 01:002536D6 C:\Program Files\Business Objects\BusinessObjects Data Services\bin\acta.dll
    Thanks
    Georg

    Do you use any DQ transform or Data_Transfer? If so, this is a bug that we will fix in a later release.

  • Change source path in batch Job in global variable in data services

    Hi Experts,
    my organization created a job in Data Services 3.2 to cleanse data read from Excel flat files. The folder path was stored in a global variable (I think), and the directories have now changed, hence it is throwing the error below.
    Error, Input file  does not exist please confirm existence and restart job, 16 ) >
    failed, due to error <50316>: <>>> Error, Input file  does not exist please confirm existence and restart job>. I want to update the folder path. I am sure it is easy, but I am very new to BODS.
    (12.2) 07-15-14 16:10:08 (14232:12656)  PRINTFN: > 'JOB DEBUG' : '>>> Sleeping for 35.000000 seconds...  '
    (12.2) 07-15-14 16:10:43 (14232:12656)  PRINTFN: > 'JOB DEBUG' : '>>> Waking up......  '
    (12.2) 07-15-14 16:10:43 (14232:12656)  PRINTFN: > 'JOB DEBUG' : 'Starting the timer loop number 6...'
    (12.2) 07-15-14 16:10:43 (14232:12656) WORKFLOW: Work flow <WF_Metadata_Files> is started.
    (12.2) 07-15-14 16:10:43 (14232:12656)  PRINTFN: > 'JOB DEBUG' : '>>> $G_FILENAME_IN : ALL_Metadata_SALES.xls...'
    (12.2) 07-15-14 16:10:43 (14232:12656)  PRINTFN: > 'JOB DEBUG' : '>>> looking for input file name
                                                     \\infra\finance\production\sales\Metadata\ALL_Metadata_SALES.xls'
    (12.2) 07-15-14 16:11:08 (14232:12656)  PRINTFN: > 'JOB DEBUG' : '>>>  Input file Name is '
    (12.2) 07-15-14 16:11:08 (14232:12656)  PRINTFN: > 'JOB ERROR' : '>>> Error, Input file  does not exist please confirm existence and restart job'
    I want to update the folder path \\infra\finance\production\sales\Metadata\ALL_Metadata_SALES.xls to \\Home\BIData\finance\production\sales\Metadata\ALL_Metadata_SALES.xls.
    When I investigated WF_Metadata_Files I saw there is a global variable called INPUT_DIR, so I assume I have to change the path there. I tried to find the old directory in the batch job but I can't find it, and even when I give a value to the global variable it still points to the old path.
    Can anybody please help me?
    Thanks
    Tim

    Hi Tim,
    If, having specified the value in the global variable, it still points to the old path, a couple of scenarios may apply:
    1. A different global variable is being used for the file path.
    2. The file path is hardcoded in the file format or Excel file definition despite the declaration of the global variable.
    Are you getting this error when running a dataflow within this workflow, or in a script? It would be better to run the workflow in debug mode and step through the stages to find out where exactly in the workflow it fails.
    kind regards
    Raghu
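
    For what it's worth, jobs like this usually assemble the input file name in a script from a directory global and a file-name global, so that expression is what to search for in the job. A sketch using the names visible in the trace (the actual variable names in the job may differ):

    # the trace suggests the job builds something equivalent to this
    $G_INPUT_FILE = $INPUT_DIR || '\\' || $G_FILENAME_IN;
    print('>>> looking for input file name ' || $G_INPUT_FILE);

    If no such script exists, check the Excel file format definition itself for a hardcoded directory, as Raghu suggests.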

  • DATA Log file path change in sql

    Dear experts,
    due to non-availability of disk space I have moved my SAP log file to another drive. SAP is now working fine, but when I try to change the path in SQL 2005 it does not give me permission to do so.
    Please let me know how to change the log file path in SQL 2005.
    Also, if in future I move my SAP data file, how do I configure the new path for the database in SQL 2005?
    Regards,
    jitendra.

    Hi,
    Kindly go through the web links below. Before starting the process, stop the SAP instance.
    http://support.microsoft.com/kb/224071
    Set the log DB size - http://help.sap.com/saphelp_nwce72/helpdata/en/f2/31ad9b810c11d288ec0000e8200722/content.htm
    and refer to SAP Note 363018.
    Regards
    Sriram
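
    For reference, the usual T-SQL sequence for relocating a log file in SQL Server 2005 looks like this (a sketch; the database name SID and the logical file name SIDLOG1 are placeholders, so check the real names with sp_helpfile, and stop the SAP instance first as Sriram says):

    -- find the logical file names and current paths
    USE SID;
    EXEC sp_helpfile;
    -- point SQL Server at the new location
    ALTER DATABASE SID MODIFY FILE (NAME = SIDLOG1, FILENAME = 'E:\SQLLOG\SIDLOG1.ldf');
    -- take the database offline, move the .ldf file in Windows, then bring it back
    ALTER DATABASE SID SET OFFLINE;
    ALTER DATABASE SID SET ONLINE;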
