Regarding rescheduling jobs in the Data Services Management Console

Hi friends,
Some issues are causing a job to fail. Is it possible to rerun the job once more, i.e. can we rerun the job in production for a brief investigation? If so, what would the consequences be?
1)     Will it duplicate data?
2)     Or is there any possibility of data loss?
Please give a suggestion and explain the consequences if we do this.
Thanks in advance

I would not suggest doing anything to your production data. Instead, make the following change in the job. Add a Map_Operation transform just before the point where you write to the target table, and inside it set every operation code to Discard. To see what data your job is trying to write to the target table, create a template table and connect it to the transform that sits just before the Map_Operation; assuming you have a Query transform at the end before the target table, connect that Query transform to the Map_Operation as well as to the template table.
You can now see what data your job would write to the target without writing anything to the target.
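As a rough sketch (the transform names are placeholders for whatever your data flow actually contains), the same Query transform feeds both branches:

    ... -> Query -> Map_Operation (every operation code set to Discard) -> Target table
           Query -> Template table (review here the rows the job would have written)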
Test this in dev first before trying it in production.
REMEMBER: PRODUCTION IS A SACRED SPACE AND WE DON'T WRITE UNNECESSARY DATA OR DO ANY MANIPULATION IN PROD TABLES.
Thanks,

Similar Messages

  • Displaying load statistics in Data Services Management Console

    Hi,
    I am new to BO Data Services. Currently I am running BO Data Services 3.2. I have developed a simple job which takes records from a source table (SRCTBL) and inserts them into a target table (TGTTBL) on a SQL Server 2005 database. There is no additional processing. SRCTBL and TGTTBL both have the same column definitions, and I have defined the first column, MYID, as a primary key. On TGTTBL I have set the Bulk Loader Options to append mode with maximum rejects set to 1.
    The job runs as expected, skipping the bad record whenever a record with a duplicate MYID key violates the primary key constraint on TGTTBL. My problem is that the number of records that were not loaded/skipped is not displayed in the Job Monitor Log under the Data Services Management Console:
    Dashboards > Operational > Job Execution History > Log Viewer
    Is there any interface in the Data Services Management Console that shows me the following statistics for a loading operation:
    - number of source records
    - number of records loaded
    - number of records skipped/discarded
    Really appreciate any pointers that you may have.
    Thanks.
    Regards,
    CS


  • Data Services Management Console Error.

    Hi Experts,
    I have installed SAP Data Services 4.2. At the time of installation the Data Services Management Console worked fine, but now when I try to open it I get the error "web page is not available". Please guide me in troubleshooting the issue.
    Regards
    Ajit

    Hi Arun Kumar,
    Thanks for your reply.
    I checked the service properties and the path it executes from, and they were correct, but I didn't find any .conf file in the Conf folder.
    Can you please elaborate on where to check this? It is very much needed for my task.
    Thanks in advance
    Ajit

  • Data services management console deployment on weblogic

    I have installed SAP Information Platform Services 4.1 SP2 without the Tomcat server. I have deployed the BI applications using WebLogic 11 and they are working fine.
    I have installed SAP Data Services on top of it. The installation runs fine.
    I am not able to find the Data Services Management Console link in the Start menu.
    I think it is not appearing in the menu because Tomcat is not installed. But isn't there a way I can deploy the Management Console on WebLogic 11?
    Please help


  • What format to input $currentdate(datetime) Global Variable in Data Services Management Console?

    What is the correct input format for the datetime Global Variable in Data Services Management Console? 
    I've tried several formats and keep getting a syntax error. 
    The DSMC is Version: 14.0.3.451
    I'm a new user and learning as I go.  
    Thanks for your help. 

    Hi,
    if you get a syntax error like the one below,
      Syntax error at line <1>: <>: near <.04> found <a float> expecting <+, ||, DIVIDE, MOD, *, SUBVARIABLE, a decimal>.
      1 error(s), 0 warning(s).
      Check and fix the syntax and retry the operation.
      Error parsing global variable values from the command line: <$sedate=2014.12.04 12:00:00;>.
    Check the syntax and try again.
    the solution is simple:
    place the global variable value in single quotes, e.g. '2014.12.04 12:00:00'.
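    Using the variable from the error message above as an example, the corrected entry would look like this:
      $sedate='2014.12.04 12:00:00';
    With the quotes, the value is passed as a single datetime literal instead of being parsed as an unquoted numeric expression, which is what produced the "expecting <+, ||, DIVIDE, MOD, *...>" error.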

  • Data Services Management Console

    Hi All
    I have 2 Data Services systems on 2 separate servers, 1 x Dev and 1 x Live. I have migrated the repositories from the Dev box onto the Live box. I have created new job servers for the migrated repositories and changed the datastore connections to point to the live database. I then set up the repositories in the Management Console.
    The strange thing is that a few of the repositories still have a link back to the Dev job servers as well as the Live job servers, even though I have created new job servers on the Live box for the migrated repositories and have changed all connections from Dev to Live. How do I get rid of the references to the Dev job servers on the Live box?
    Thanks

    Hi
    I guess that your repo migration was done by copying the databases/schemas.  If so, this is not best practice as it will take environmental information with it. 
    To remove the "old" job servers, you can remove the entries from a repository table - al_machineinfo.  However, I should warn you that editing the repo in production manually should be done by someone with VERY good Data Services knowledge.
    Good luck!
    Michael

  • Data Service Management Console (Administrator) not coming up

    Hi Experts,
    Was wondering if someone can help me a little. I am having issues with the Management console not coming up. This is what the address is:
    http://localhost:8080/DataServices
    The only message I see is : "waiting for localhost"
    I can see that the service Apache Tomcat for BI 4 is up and running.
    It was running up until yesterday but today the log-in page just does not come up. Would appreciate any help possible to look into this.
    I am also able to successfully click on Impact and Lineage Analysis and navigate there. It is only the admin console that won't open.
    Just as further information, I set up a couple of real-time jobs and something has probably gone wrong since then. Is there a way to disable real-time jobs from outside of the Management Console?
    Thanks.
    MD

    Hi Manish,
    Is the path correct?
    Try http://localhost:8080/BOE/CMC

  • Service manager console can't connect to Service manager data warehouse SQL reporting services

    When I start Service manager console, it gives this kind of error:
    The Service Manager data warehouse SQL Reporting Services server is currently unavailable. You will be unable to execute reports until this server is available. Please contact your system administrator. After the server becomes available please close your console and re-open to view reports.
    Also in EventViewer says:
    cannot connect to SQL Reporting Services Server. Message= An unexpected error occured while connecting to SQL Reporting Services server: System.Net.WebException: The request failed with HTTP status 401: Unauthorized.
    at System.Web.Services.Protocols.SoapHttpClientProtocol.ReadResponse(SoapClientMessage message, WebResponse response, Stream responseStream, Boolean asyncCall)
    at System.Web.Services.Protocols.SoapHttpClientProtocol.Invoke(String methodName, Object[] parameters)
    at Microsoft.EnterpriseManagement.Reporting.ReportingService.ReportingService2005.FindItems(String Folder, BooleanOperatorEnum BooleanOperator, SearchCondition[] Conditions)
    at Microsoft.EnterpriseManagement.Reporting.EnterpriseReporting.FindItems(String searchPath, IList`1 criteria, Boolean And)
    at Microsoft.EnterpriseManagement.Reporting.EnterpriseReporting.FindItems(String itemPath)
    at Microsoft.EnterpriseManagement.Reporting.EnterpriseReporting.FindItem(String itemPath, ItemTypeEnum[] desiredTypes)
    at Microsoft.EnterpriseManagement.Reporting.EnterpriseReporting.GetFolder(String path)
    at Microsoft.EnterpriseManagement.Reporting.EnterpriseReportingGroup.Initialize()
    at Microsoft.EnterpriseManagement.Reporting.ServiceManagerReportingGroup..ctor(DataWarehouseManagementGroup managementGroup, String reportingServerURL, String reportsFolderPath, NetworkCredential credentials)
    at Microsoft.EnterpriseManagement.Reporting.ServiceManagerReportingGroup..ctor(DataWarehouseManagementGroup managementGroup, String reportingServerURL, String reportsFolderPath)
    at Microsoft.EnterpriseManagement.UI.SdkDataAccess.ManagementGroupServerSession.TryConnectToReportingManagementGroup() Remediation = Please contact your Administrator.
    We have a four-server setup where SCSM, SCDW, and the SQL Servers for both are on different servers. Also, I have read that this could be an SPN problem, but this was worked on last week without the SPNs.

    On the computer where you get the "SQL Reporting Services server is currently unavailable" message, please open Internet Explorer and try to connect to the URL http://<NameOfReportingServer>/reports
    This should open the reporting website in IE. If this isn't working, you should check the proxy settings in IE. If the URL doesn't work in IE, it won't work in the SCSM console either (and vice versa).
    Andreas Baumgarten | H&D International Group
    Actually, I can't access the reporting website. It asks me for credentials three times and then returns a blank page. Also, an error message appears in the Event Viewer System log with ID 4 and source Security-Kerberos.
    The Kerberos client received a KRB_AP_ERR_MODIFIED error from the server "accountname".
    The target name used was HTTP/"reporting services fqn". This indicates that the target server failed to decrypt the ticket provided by the client.
    This can occur when the target server principal name (SPN) is registered on an account other than the account the target service is using.
    Ensure that the target SPN is only registered on the account used by the server.
    This error can also happen if the target service account password is different than what is configured on the Kerberos Key Distribution Center for that target service.
    Ensure that the service on the server and the KDC are both configured to use the same password.
    If the server name is not fully qualified, and the target domain (domain.com) is different from the client domain (domain.com), check if there are identically named server accounts in these two domains,
    or use the fully-qualified name to identify the server.
    I can access the website directly from the server which hosts Reporting Services.
    Also, I queried setspn -Q HTTP/<reporting services FQDN> with the result NO SUCH SPN FOUND.
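    If the SPN really is missing, one way to register it (a sketch only: the host and account names are placeholders, and the account must be the one the Reporting Services application pool actually runs under) would be to run, from an elevated prompt with domain admin rights:
      setspn -S HTTP/<reporting services FQDN> <DOMAIN>\<RS service account>
    After registering the SPN, restart the Reporting Services service and retry the /reports URL from the SCSM console machine.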

  • I can't see the Reporting Workspace in Service Manager Console PLEASE HELP

    I can't see the Reporting Workspace in Service Manager Console PLEASE HELP

    Hello,
    If you don't have an entry like "SQL Server Reporting Services (InstanceName)" in the Windows Service Manager (Services.msc), that is because the SSRS service is not installed on that machine. Install it, and then you will see such an entry.
    Olaf Helper
    This is not a SQL issue, it's a SC Service Manager issue. Reporting Services is installed and running on the SQL side. The problem, as I am now aware, is that I cannot get the Data Warehouse server to register in the Service Manager console so that I can connect it with the canned reports and turn on the Reporting workspace. Any thoughts?

  • Remove items from the Tools menu in the Service Manager console

    Hi. 
    I know I have seen a post regarding this before, but I just wanted to put it out there once again. I'd like to remove items from the Tools menu in the Service Manager console - specifically My Notifications and Create Change Request. This is because the former allows a user to create notification subscriptions in incorrect Management Packs, and the latter because we are not using Change Management yet. Does anyone have any information on how to achieve this?
    Cheers
    Shaun

    How to customize the tools that are displayed in the Tools menu:
    http://technet.microsoft.com/en-us/library/jj134147.aspx#BKMK_tools

  • .bat or .sh file for executing jobs in data services

    Are there any command-line utilities with which I can import the .atl files and execute the jobs in Data Services?
    I was able to import the .atl files using 'al_engine' but was not able to find the arguments to execute the jobs.
    Here is my requirement:
    There would be 2 master scripts :
    Master_Initial_Script.bat/.sh - This would have all the pre/post checks and would call the DS Initial Job
    Master_Delta_Script.bat/.sh - This would have all the pre/post checks and would call the DS Delta Job
    Pre-Checks for Delta Job -
    If the initial job is not loaded then
                    Do not move further to execute the Delta job
    else
                    Execute the Delta job
    Post Checks for Delta Job:
    Print a statement when the job starts successfully,
    Check the error/return code and print a job success message in a log file

    This looks more like scheduling the job with dependencies.
    Unfortunately, the BODS scheduler doesn't support setting dependencies between jobs.
    So the best way forward is to export the jobs to batch files in a particular location.
    Use the batch files with an external scheduling tool and set the dependencies there.
    A scheduling tool such as Control-M or Redwood Cronacle should be able to execute the batch files.
    Set the dependency in the scheduling tool accordingly.
    Let me know if you need any more details.
    Regards,
    Kishore Reddy L.
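    As a rough illustration of the wrapper idea (a sketch only, not SAP-delivered code: the exported launch script name Job_Delta.sh, the flag file, and the log path are assumptions you would replace with your own):
      #!/bin/sh
      # Master_Delta_Script.sh - wraps the launch script exported from the Management Console
      LOG=/var/log/ds_jobs/master_delta.log
      INITIAL_FLAG=/var/log/ds_jobs/initial_load_done.flag

      # Pre-check: do not run the delta job unless the initial load has completed
      if [ ! -f "$INITIAL_FLAG" ]; then
          echo "`date` Initial load not completed - skipping delta job" >> "$LOG"
          exit 1
      fi

      echo "`date` Delta job started" >> "$LOG"

      # Call the launch script that Data Services exported for the delta job
      /opt/ds/scripts/Job_Delta.sh
      RC=$?

      # Post-check: log success or failure based on the return code
      if [ $RC -eq 0 ]; then
          echo "`date` Delta job finished successfully" >> "$LOG"
      else
          echo "`date` Delta job failed with return code $RC" >> "$LOG"
      fi
      exit $RC
    The external scheduler (Control-M, Cronacle, cron, etc.) then only needs to call this wrapper; the dependency logic and the logging live in the script itself.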

  • Scheduling automatic jobs in Data Services

    Dear Experts,
    Could you please explain how to schedule an automatic job in Data Services?
    I would like to be able to schedule a daily, weekly, or monthly job.
    Is there any document which explains the steps?
    Thanks
    Pat


  • How to install System Centre Service Manager Console on any PC

    Hi, I'm new to the forum, and am trying to get SCSM installed into the organisation for Servicedesk purposes.
    According to several pieces of documentation, I should be able to "manually install the Service Manager console as a stand-alone piece on a computer". However, I am unable to find any documentation or instructions on how to install a fat console on a PC.
    Can someone help me with the instructions?
    I understand that future versions could well include an Analyst web portal, which hopefully will mature the analyst environment significantly (currently the portal supports CRs only, and having to use the console is far from ideal).

    We had created an SCCM package to deploy the SCSM 2010 SP1 console to users' PCs. I also found someone posting the following URL for SCSM 2012:
    Service Manager Console Installation via Configuration Manager
    http://blogs.technet.com/b/servicemanager/archive/2012/04/05/service-manager-console-installation-via-configuration-manager.aspx

  • Unable to logon to Oracle Web services manager console

    Hi guys... I am Dan.
    I have Oracle SOA Suite 10.1.3.1.0. I am trying to log on to the Oracle Web Services Manager console.
    I tried all the possible usernames and passwords, like:
    un: default
    pw: welcome1
    un: oc4jadmin
    pw: oc4jadmin (this is the password that I gave for the administrator at the time of installation of soa suite)
    un: oc4jadmin
    pw: welcome1
    Thanks in advance guys.....

    OK, the table orawsm.users is empty after the upgrade.
    I need to re-execute irca.bat then.
    But where is the SQL script for actually populating the orawsm schema?

  • Data Services Admin Console "Read Timed Out"  via Job Monitor Log Tab.

    Hi All,
    I am getting a "Read Timed Out" error on the Job Monitor Log tab when large ETL jobs are being executed.
    Is there a resolution for this issue?
    Thank you
    Kind regards
    Hai.

    Hi Manoj,
    I have checked the UNIX server and admin.xml is not located in the advised location.
    The only two files in the advised location are:
    - ContentTypes.xml 
    - PythonAPI.xml
    I have another install of Data Services (a default installation) in a Windows environment; I checked the advised location there and the file is present.
    Is it possible to copy the Windows admin.xml to the UNIX server?
    Thanks
    Kind regards
    Hai.
