COI Investment data reading

If the investment data has to be read through the 'Totals' tables (instead of from AFD), does it require that every ECC investment entry have the Percentage of Investment and Investee fields filled in? Can the system read the ECC investment data without these fields?
Is there any other way to populate these fields on the way from ECC to BI/BCS?
Are there any additional fields that need to be captured in ECC for investment entries, other than the two fields above?

Hi Eugene,
I really need your advice on my AFD setup. I hope you can help.
My current Location of Values settings for AFD are as follows:
Read Investment Data From : Additional Financial Data
Read Equity Data From : Additional Financial Data
Read Equity Holdings Adjs From : Additional Financial Data
The user only performs an impairment test for goodwill.
So, in the method, I only selected "Extraordinary Amortization of Goodwill" and "Extraordinary Amortization of Negative Goodwill".
My question is: for first consolidation, what should I enter in AFD manual data entry as a first step to record the investment, equity, and the goodwill? Will the system calculate the goodwill automatically, or do we need to enter a manual journal entry for the impairment test?
Assume that the holding company owns 73% of its subsidiary.
Thanks a bunch
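
For orientation, here is a minimal worked example of the goodwill calculation at first consolidation. The figures are purely hypothetical; only the 73% share comes from the question. With an investment carrying amount of 1,000 and investee equity of 1,200 at the acquisition date:

goodwill = investment − 73% × equity = 1,000 − 0.73 × 1,200 = 1,000 − 876 = 124

Whether the system derives this difference automatically or expects it as a manual AFD entry depends on the COI method settings; the impairment itself would then be posted through the "Extraordinary Amortization of Goodwill" activity selected above.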

Similar Messages

  • Investment data

    In COI, if the investment data has to be read from the 'Totals' cube (that is, whatever is entered in ECC is read directly), do the ECC entries have to contain the 'Activity' field to tell BCS whether it is a first consolidation, an increase/decrease in capitalization, etc.?
    My doubt is this: AFD entries clearly tell the BCS COI functions what kind of activity the data relates to, so should the ECC entries indicate the same (in case the COI investment data is read only from the 'Totals' cube)?
    Please let me know your experience.

    Certainly, the data basis contains an 'Activity' field in the investment data stream. However, AFAIK, the system may also determine the activity indirectly. Remember the Location of Values / Investments tab? There every CoI activity can be distinguished by item and movement type. So, when the system encounters an item/movement type configured there in the incoming data, it should recognize the type of activity.
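
    Purely as an illustration of that derivation logic (this is not BCS code, and the item and movement-type values below are made up), the recognition works roughly like a lookup keyed by item and movement type:

    import java.util.HashMap;
    import java.util.Map;

    // Illustrative sketch only: a CoI activity derived from (item, movement type),
    // mirroring the Location of Values / Investments configuration described above.
    // The item and movement-type codes are hypothetical placeholders.
    public class ActivityDerivationSketch {
        record ItemMoveType(String item, String moveType) {}

        public static void main(String[] args) {
            Map<ItemMoveType, String> locationOfValues = new HashMap<>();
            locationOfValues.put(new ItemMoveType("171100", "100"), "First Consolidation");
            locationOfValues.put(new ItemMoveType("171100", "120"), "Increase in Capitalization");
            locationOfValues.put(new ItemMoveType("171100", "140"), "Decrease in Capitalization");

            // An incoming totals record carrying a configured item/movement type
            // is recognized as the corresponding CoI activity.
            ItemMoveType incoming = new ItemMoveType("171100", "100");
            System.out.println("Derived activity: " + locationOfValues.get(incoming));
        }
    }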

  • Cross investment data

    Hello,
    We are trying to set up a cross-participation between two entities, without success.
    Scenario:
    A holds 80% of B.
    B holds 5% of A.
    In the investment data, we first create the participation of A in B as "First Consolidation".
    Then, when we create the second investment entry, B in A, also as "First Consolidation", the system gives us an error.
    The error  message:
    Diagnosis:
    An activity First Consolidation with activity number 1000000012 already exists for investee unit A.
    System Response
    The system terminates processing.
    Procedure
    Create the activity for a different investee unit or add activity number 1000000012 to the activity.
    Can somebody help us?

    Hi,
    First consolidation is an activity that the system performs the first time an investee is acquired in a consolidation group.
    First consolidation of an investee is a prerequisite for processing further consolidation of investments activities.
    So, try using the activity "Step Acquisition" for the second investment in the same investee.
    Hope it helps.

  • Who has worked with the ICS Model 4896 GPIB module?

    I cannot read the data from the module. Can you suggest how to do it, preferably with examples (reading data out from the module and transmitting data between channels)? Thanks in advance.

    Hello. Most of the engineers in developer exchange are more familiar
    with NI products. Contacting ICS for technical support is a better
    course of action.

  • SCOM 2012 R2 Reporting services installation error - data reader account

    I have successfully installed SCOM 2012 R2 and SQL 2012 SP1 on the same server. I was trying to install the reporting server piece of SCOM.
    When I get to the part that asks for a domain account, I get this error:
    "Data Reader account provided is not same as that in the management server"
    The account is not the same as the management server's, and it does have read access to the data warehouse and reporting databases. Is there some other permission this account needs for Reporting Services?
    Thanks.

    Adding more info:
    I usually create the following account for SQL:
    AD accounts for the following to be used with SQL Services:
    SQL Agent for SQL Server Agent
    SQL Analysis Services for SQL Server Analysis
    SQL Database Engine for SQL Server Database Engine
    SQL Reporting Services for SQL Server Reporting Services
    For SCOM I create the following accounts:
    SCOM DR - Used to deploy reports
    SCOM DWR - for writing data from the management server to the reporting DW
    SCOM MSA - Performs default actions
    SCOM SDK - Used for updating/reading information from the operations Manager Database - Needs to be local administrator on the SCOM server

  • Error : No data read for fiscal year 2006 (error  FDBL020)

    Hello !
    I have a problem in transaction FS10N: when I try to execute it, the error "No data read for fiscal year 2006" (FDBL020) appears. I tried applying note 302263 (deleting the attributes of the programs), but the attributes had already been deleted. Does anybody know a solution?
    Thanks
    Claudenir

    Hi,
    I meant: check whether there are any items posted in that fiscal year.
    Andreas

  • No data read for fiscal year 2011

    In ECC 6, after posting a document in FI, I want to check the balances with transaction FS10N, entering the G/L account in which the document entry is supposed to be recorded. But when I execute it, I get the message "No data read for fiscal year 2011" in an information window (long text available).
    What does it mean? An explanation would be appreciated.

    I saved the entry and then got these messages. When I tried to display the document:
    error message "Document no. 200000003 sony does not exist in fiscal year 2011"
    When I tried to display the G/L account:
    message "No data read for fiscal year 2011"
    When I tried to reverse document no. 200000003:
    message "Document no. 200000003 sony does not exist in fiscal year 2011"
    My document type is SA and the reversal document type is AB; the number range for this document type exists in fiscal year 2011.
    I am really confused about where my configuration went wrong.
    Suggestions and expert opinions are appreciated.

  • Reporting Install fails - Data Reader account is not the same as in management group

    We recently upgraded from SCOM 2012 SP1 to R2, but I'm unable to get the reporting role installed. I get to the part where I enter the Data Reader account credentials, then get the error "Data Reader account provided is not same as that in the management group." No matter what I do, I cannot get past this error.
    Here is what I have tried so far:
    - I have verified the correct DR account is set in the Data Warehouse Report deployment account and profile.
    - I have deleted the DR account from the profile and added it back.
    - Verified all permissions are set up properly on the SQL server for all databases, including reporting.
    - Completely uninstalled SSRS and re-installed.

    You must distribute the Run As account to ensure that all members of the resource pool (All Management Servers Resource Pool) have access to the permissions in the Run As account.
    Also check the link below (it's the same issue):
    http://social.technet.microsoft.com/Forums/systemcenter/en-US/11e25eac-fbde-4ed5-aff5-4faa0563cfbc/scom-2012-reporting-installation-issue?forum=operationsmanagerdeployment
    Mai Ali

  • "Master Data Read Class Parameters" field is disabled

    Hi.
    While creating an InfoObject, I switched the master data access from Default to Own Implementation.
    After this, the screen is redisplayed and only the "Name of Master Data Read Class" field becomes enabled.
    I need the "Master Data Read Class Parameters" field, but it is disabled, and the button next to this field has the same status.
    Why? How can I make them active?

    Hi,
    I think it is not possible to add your own master data read class if the reference is 0DATE or if the type is DATS.
    So maybe you can create your InfoObject as type CHAR, length 8.
    Then you could add a conversion routine for the required date display format.

  • Weblogic 11G error = BEA-000449  Closing socket as no data read from it

    In my WebLogic 11g installation, I am getting a warning message in my log file saying "Closing socket as no data read from it":
    ####<Nov 2, 2010 12:10:53 AM IST> <Warning> <Socket> <TradeServer> <TradeServer> <[ACTIVE] ExecuteThread: '8' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <1288636853607> <BEA-000449> <Closing socket as no data read from it on 95.66.7.15:58,089 during the configured idle timeout of 25 secs>
    ####<Nov 2, 2010 12:10:53 AM IST> <Warning> <Socket> <TradeServer> <TradeServer> <[ACTIVE] ExecuteThread: '8' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <1288636853607> <BEA-000449> <Closing socket as no data read from it on 95.66.7.15:58,088 during the configured idle timeout of 25 secs>
    ####<Nov 2, 2010 12:21:37 AM IST> <Info> <JDBC> <TradeServer> <TradeServer> <[ACTIVE] ExecuteThread: '23' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <1288637497701> <BEA-001128> <Connection for pool "IB_JDBC_Data_Source" closed.>
    I have followed these steps:
    Go to the Admin Console -> Domain -> Configuration -> Log Filters -> create a new log filter.
    I added this expression: "(MESSAGE !='Closing socket as no data read from it during the configured idle timeout of 5 secs')"
    Then go to each server (it has to be set individually per server) -> Logging -> Advanced -> select this log filter for Standard Out or the log file.
    My filter is "(MESSAGE != 'Closing socket as no data read from it')"
    This is not working in WebLogic 11g. Does anyone have a solution to stop this message?
    Edited by: Amar_Shaw on Nov 3, 2010 1:40 PM

    Hi Amar,
    I think you have given the wrong string in the filter: the message you are actually getting is "Closing socket as no data read from it on 95.66.7.15:58,089 during the configured idle timeout of 25 secs", but the string in your filter is "Closing socket as no data read from it during the configured idle timeout of 5 secs" (see the small illustration after this reply).
    You can change it and see if that works for you.
    Also, the above option only suppresses the message, which in this case is fine since it is just a warning. However, you can also try tuning a few of the following parameters, which should help remove the warning:
    1. Set the parameter -Dweblogic.client.socket.ConnectTimeout=XXX under JAVA_OPTIONS in the start-up script of the server on which you are seeing this issue.
    Note: "XXX" is the value in milliseconds.
    Example:
    -Dweblogic.client.socket.ConnectTimeout=500
    2. Try raising the Duration value via the following Console path:
    Server -> Protocols (tab) -> HTTP (sub-tab) -> Duration
    Regards,
    Ravish Mody
    http://middlewaremagic.com/weblogic/
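
    To make the mismatch concrete, here is a small, purely illustrative Java check (this is not WebLogic filter syntax) showing why an exact-equality comparison never matches the logged message, while a prefix test does:

    public class FilterMatchSketch {
        public static void main(String[] args) {
            // The message actually logged (note the IP:port and the 25-second timeout).
            String actual = "Closing socket as no data read from it on 95.66.7.15:58,089 "
                    + "during the configured idle timeout of 25 secs";
            // The string used in the filter expression above.
            String filtered = "Closing socket as no data read from it "
                    + "during the configured idle timeout of 5 secs";

            System.out.println(actual.equals(filtered));                             // false: exact match fails
            System.out.println(actual.startsWith("Closing socket as no data read")); // true: a pattern-style test matches
        }
    }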

  • Data read from undo

    Hi,
    How can I find out whether data was read from the undo tablespace?
    How can I find out whether data was read from the data file?
    Does a SELECT statement ever read from the redo log? I think not; is that correct?
    Thanks and regards

    user3266490 wrote:
    Hi,
    Thanks for the reply.
    "What does it mean by 'how to find data read'?"
    It means: how to find out whether a SELECT statement read from the buffer cache or from the data file. Even if it is read from the data file, is it first kept in the buffer and then returned to the user?
    Yes, data is always read through the buffer cache, even when a physical read is involved. If you want to see whether physical IO was involved or only logical IO (from the cache), you can check the statistics for the query as shown below:
    SQL*Plus: Release 10.2.0.1.0 - Production on Thu May 21 11:59:39 2009
    Copyright (c) 1982, 2005, Oracle.  All rights reserved.
    Connected to an idle instance.
    SQL> startup
    ORACLE instance started.
    Total System Global Area  167772160 bytes
    Fixed Size                  1247900 bytes
    Variable Size              75498852 bytes
    Database Buffers           88080384 bytes
    Redo Buffers                2945024 bytes
    Database mounted.
    Database opened.
    SQL> conn aman/aman
    Connected.
    SQL> set autot trace stat
    SQL> select * from scott.emp;
    14 rows selected.
    Statistics
            455  recursive calls
              0  db block gets
             83  consistent gets
             10  physical reads                      <----------------------- This went to disk first to read the data
              0  redo size
           1415  bytes sent via SQL*Net to client
            381  bytes received via SQL*Net from client
              2  SQL*Net roundtrips to/from client
              6  sorts (memory)
              0  sorts (disk)
             14  rows processed
    SQL> select * from scott.emp;
    14 rows selected.
    Statistics
              0  recursive calls
              0  db block gets
              8  consistent gets
              0  physical reads                    <---------------No PIO, which means it was accessed truly from the cache and didn't involve disk IO at all.
              0  redo size
           1415  bytes sent via SQL*Net to client
            381  bytes received via SQL*Net from client
              2  SQL*Net roundtrips to/from client
              0  sorts (memory)
              0  sorts (disk)
             14  rows processed
    SQL>
    HTH
    Aman....
    Edited by: Aman.... on May 21, 2009 11:58 AM
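
    Another way to see the same physical-read and consistent-get counters is to query the session statistics views directly. A minimal JDBC sketch follows (the connection details are placeholders, and the user needs SELECT access on V$MYSTAT and V$STATNAME):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    // Minimal sketch: read this session's buffer-cache statistics after running a query.
    public class SessionIoStats {
        public static void main(String[] args) throws Exception {
            String sql = "SELECT sn.name, ms.value "
                       + "FROM v$mystat ms JOIN v$statname sn ON sn.statistic# = ms.statistic# "
                       + "WHERE sn.name IN ('physical reads', 'consistent gets', 'db block gets')";
            // Hypothetical connection details.
            try (Connection conn = DriverManager.getConnection(
                         "jdbc:oracle:thin:@//dbhost:1521/ORCL", "scott", "tiger");
                 PreparedStatement stmt = conn.prepareStatement(sql);
                 ResultSet rs = stmt.executeQuery()) {
                while (rs.next()) {
                    System.out.println(rs.getString(1) + " = " + rs.getLong(2));
                }
            }
        }
    }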

  • BEA-000449  Closing socket as no data read from it

    This error message is filling up the server log files. It looks like a network problem. How can I find the cause? It does not create a functional problem, but I am sure there will be a performance problem. Also, since this message is filling up the logs rapidly, I cannot see my regular application debug statements.
    <Warning> <Socket> <myserver.mydomain.net> <my_managedserver_01> <[ACTIVE] ExecuteThread: '5' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <> <1218554019557> <BEA-000449> <Closing socket as no data read from it during the configured idle timeout of 30 secs>
    The timeout of 30 seconds is the value set for Login Timeout on the server's Tuning tab.
    Environment:
    WebLogic Portal 10.0 MP1 (The domain is a server domain, not portal domain)
    Red Hat linux 4
    Intel Xeon
    Message was edited by:
    prakashp

  • Closing socket as no data read

    Hi,
    I am using the WebLogic 10.3 application server. I am frequently seeing the warning below in the log file:
    <BEA-000449> <Closing socket as no data read from it during the configured idle timeout of 0 secs>
    As a result, a "Page cannot be displayed" error appears in the browser for that request. I checked config.xml; nowhere is a timeout of 0 seconds configured.
    Please help me with your suggestions.
    Regards
    Purushoth
    Edited by: user13299431 on Nov 29, 2010 2:01 AM

    Refer to this link: http://forums.oracle.com/forums/thread.jspa?messageID=9114422

  • CLOB Data Read

    Hi,
    We have a table in our Oracle database with a CLOB column. This column is populated from a Java application, which reads an Excel file from the front end into the CLOB column.
    My requirement is to read this CLOB column from the database and write the data to a file in CSV format. Say the Excel file read into the CLOB had 3 columns and 10 rows; then I should be able to read this column and write a file (CSV format) with the same structure as the Excel file.
    For this I call a Java concurrent program.
    Can anyone tell me how to do this? Any help would be appreciated.
    Thanks in advance
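
    A minimal plain-JDBC sketch of the step being asked about, independent of the concurrent-program framework: it streams the CLOB column to a .csv file, assuming the CLOB already holds the rows as delimited text. The table and column names follow the code in the reply below; the connection details, file_id value, and output path are placeholders:

    import java.io.BufferedWriter;
    import java.io.Reader;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    // Minimal sketch: stream a CLOB column into a CSV file.
    public class ClobToCsv {
        public static void main(String[] args) throws Exception {
            String sql = "SELECT file_contents FROM xxcrm_file_upload WHERE file_id = ?";
            try (Connection conn = DriverManager.getConnection(
                         "jdbc:oracle:thin:@//dbhost:1521/ORCL", "apps", "apps");
                 PreparedStatement stmt = conn.prepareStatement(sql)) {
                stmt.setLong(1, 1001L); // hypothetical file_id
                try (ResultSet rs = stmt.executeQuery()) {
                    if (rs.next()) {
                        try (Reader reader = rs.getCharacterStream(1);
                             BufferedWriter writer = Files.newBufferedWriter(Paths.get("/tmp/output.csv"))) {
                            char[] buffer = new char[4096];
                            int read;
                            while ((read = reader.read(buffer)) != -1) {
                                writer.write(buffer, 0, read); // copy CLOB characters into the CSV file
                            }
                        }
                    }
                }
            }
        }
    }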

    Hi Chandra,
    Thanks for your reply.
    I created the EO and VO for the file upload.
    I want to upload the file into the database when the Submit button is clicked, so I wrote the row-creation code in processFormRequest (PFR).
    My actual requirement is: whenever we upload a file into the database (CLOB column), a Java concurrent program converts the CLOB content and stores it in a .dat file at a certain path.
    For this I wrote the code like this:
    Actual class:
    public void processFormRequest(OAPageContext pageContext, OAWebBean webBean)
    super.processFormRequest(pageContext, webBean);
    OAApplicationModule am = pageContext.getApplicationModule(webBean);
    if (pageContext.getParameter("go") != null)
    { processFileUpload(pageContext, webBean, am);
    protected void processFileUpload(OAPageContext pageContext, OAWebBean webBean, OAApplicationModule am)
    // Get hold of the binary fle contents
    DataObject fleUploadData = (DataObject)pageContext.getNamedDataObject("fileupload");
    if (fleUploadData == null) return;
    String fleName = (String)fleUploadData.selectValue(null,"UPLOAD_FILE_NAME");
    if (fleName == null) return;
    String contentType =(String)fleUploadData.selectValue(null,"UPLOAD_FILE_MIME_TYPE");
    BlobDomain uploadedByteStream = (BlobDomain)fleUploadData.selectValue(null,fleName);
    if (uploadedByteStream == null) return;
    // convert binary fle contents into an ASCII stream
    System.out.println("after calling if block");
    try {
    String inputStream = streamToString(contentType, uploadedByteStream.getInputStream() );
    String purposeCode = null;
    System.out.println("after calling if block");
    System.out.println("store the values in table");
    // Store ASCII in the fle in the database
    primaryKey = storeStream(inputStream , purposeCode, am);
    System.out.println("after calling if block");
    System.out.println("the orgid value iss "+(am.getOADBTransaction()).getOrgId());
    int orgId = ((OADBTransactionImpl)am.getOADBTransaction()).getOrgId();
    // submit the printing request
    submitConcurrentProgram(primaryKey, purposeCode, orgId, am.getOADBTransaction().getJdbcConnection());
    } catch (IOException ex) {
    throw OAException.wrapperException(ex);
    } catch (SQLException ex) {
    throw OAException.wrapperException(ex);
    } catch (RequestSubmissionException ex) {
    throw OAException.wrapperException(ex);
    protected String streamToString(String mimeType, InputStream inputStream) throws IOException
    String result;
    System.out.println("mime type is "+mimeType);
    // check if this is an Excel spreadsheet
    if ("application/vnd.ms-excel".equalsIgnoreCase(mimeType))
    try {
    result = xlsToString(inputStream);
    } catch (jxl.read.biff.BiffException ex)
    { // if not, then assume this is an ASCII stream
    System.out.println("catch blockk");
    inputStream.reset();
    result = inputStream.toString();//streamToString(mimeType,inputStream); ////streamToString("ASCII",inputStream);
    } else
    { // otherwise an ASCII stream
    result =inputStream.toString();//streamToString(mimeType,inputStream); ////streamToString("ASCII",inputStream);
    return result;
    private final static String CSV_SEPARATOR = ",";
    private String xlsToString(InputStream stream)
    throws jxl.read.biff.BiffException
    StringWriter stringWriter = new StringWriter();
    BufferedWriter bufferedWriter = new BufferedWriter(stringWriter);
    try {
    WorkbookSettings ws = new WorkbookSettings();
    ws.setLocale(new Locale("en", "EN"));
    Workbook w = Workbook.getWorkbook(stream, ws);
    // Gets the sheets from workbook
    for (int sheet = 0; sheet < w.getNumberOfSheets(); sheet++)
    Sheet s = w.getSheet(sheet);
    Cell[] row = null;
    // Gets the cells from sheet
    for (int i = 0 ; i < s.getRows() ; i++)
    row = s.getRow(i);
    if (row.length > 0)
    bufferedWriter.write(formatExcelCell(row[0]));
    for (int j = 1; j < row.length; j++)
    bufferedWriter.write(CSV_SEPARATOR);
    bufferedWriter.write(formatExcelCell(row[j]));
    bufferedWriter.newLine();
    bufferedWriter.flush();
    catch(jxl.read.biff.BiffException ex)
    throw ex;
    catch (Exception ex)
    throw OAException.wrapperException(ex);
    return stringWriter.toString();
    private final static String QUOTE_STRING ="\"";
    private String formatExcelCell(Cell cell)
    SimpleDateFormat dateFormatter = new SimpleDateFormat("d-MMM-yyyy");
    if (cell == null) return null;
    String rowCell = cell.getContents();
    // format the date
    if ( cell.getType().equals(cell.getType().DATE) )
    rowCell = dateFormatter.format(((DateCell)cell).getDate());
    // double each quote
    String result = rowCell.replaceAll(QUOTE_STRING, QUOTE_STRING+QUOTE_STRING);
    // surround with quotes if comma or quote is present
    if (result.indexOf(CSV_SEPARATOR) >0 || result.indexOf(QUOTE_STRING) >0)
    result = QUOTE_STRING + result + QUOTE_STRING;
    return result;
    protected Number generatePrimaryKey(OAApplicationModule am)
    OAViewObject viewObject = (OAViewObject)am.findViewObject("FileUploadKeyVO1");
    viewObject.setMaxFetchSize(1);
    viewObject.executeQuery();
    return (Number) (viewObject.first().getAttribute("Id"));
    public Number storeStream(String inputStream, String purposeCode, OAApplicationModule am)
    OAViewObject viewObject = (OAViewObject)am.findViewObject("xxcrmFileUploadVO1");
    viewObject.setMaxFetchSize(0);
    ClobDomain myClob = new ClobDomain();
    myClob.setChars(inputStream.toCharArray());
    OARow row = (OARow)viewObject.createRow();
    row.setAttribute("FileContents",myClob);
    primaryKey = generatePrimaryKey(am);
    row.setAttribute("PurposeCode", purposeCode);
    row.setAttribute("FileId", primaryKey);
    viewObject.insertRow(row);
    am.getTransaction().commit();
    return primaryKey;
    public void runProgram(CpContext cpcontext) {
    try {
    // read parameters
    // execute business logic
    cpcontext.getReqCompletion().setCompletion(ReqCompletion.NORMAL, "Request Completed Normal");
    } catch (Exception ex) {
    // report exception
    cpcontext.getReqCompletion().setCompletion(ReqCompletion.ERROR, "Error building output fle");
    } finally {
    cpcontext.releaseJDBCConnection();
    static public Map convertParameters(ParameterList parameterList)
    { Map result = new HashMap();
    while( parameterList.hasMoreElements() ) {
    NameValueType nameValueType = parameterList.nextParameter();
    if (nameValueType.getValue() != null)
    result.put(nameValueType.getName(), nameValueType.getValue());
    return result;
    protected void submitConcurrentProgram(Number primaryKey, String purposeCode, int orgId,
    Connection connection) throws IOException, SQLException, RequestSubmissionException
    System.out.println("before getting the output");
    System.out.println("after getting the output");
    ConcurrentRequest request = new ConcurrentRequest(connection);
    Vector param = new Vector();
    param.add(primaryKey.stringValue());
    param.add(purposeCode);
    param.add(String.valueOf(orgId));
    int reqId = request.submitRequest("XXNC", "XXNC_FILEWRITE_JAVA", "Print CLOB",null, false, param);
    connection.commit();
    MessageToken[] tokens = { new MessageToken("REQUEST", String.valueOf(reqId)) };
    OAException confrmMessage = new OAException("XXNC",
    "XXNC_TEST_MSG",
    tokens,
    OAException.CONFIRMATION,
    null);
    throw confrmMessage;
    Java concurrent program class:
    /* for getting the CLOB data content and storing it at a certain path */
    public String getOutput(Connection connection,
    BigDecimal primaryKey) throws SQLException,
    IOException
    String statement =
    "select file_contents from xxcrm_file_upload where file_id = :1";
    String result;
    PreparedStatement stmt = null;
    ResultSet resultSet = null;
    try
    stmt = connection.prepareStatement(statement);
    stmt.setBigDecimal(1, primaryKey);
    resultSet = stmt.executeQuery();
    resultSet.next();
    result = //streamToString("application/vnd.ms-excel",resultSet.getAsciiStream(1));
    resultSet.getAsciiStream(1).toString(); //streamToString("ASCII",resultSet.getAsciiStream(1) );
    File data = new File("/u01/VIS/apps/apps_st/appl/xxnc/12.0.0/bin/TestClob.dat");
    Reader reader = resultSet.getCharacterStream(1);
    FileWriter writer = new FileWriter(data);
    char[] buffer = new char[1];
    while (reader.read(buffer) > 0) {
    writer.write(buffer);
    writer.close();
    resultSet.close();
    resultSet = null;
    stmt.close();
    stmt = null;
    } finally
    if (resultSet != null)
    try
    resultSet.close();
    } catch (SQLException ex)
    if (stmt != null)
    try
    stmt.close();
    } catch (SQLException ex)
    try
    connection.commit();
    } catch (SQLException ex)
    return result;
    public void runProgram(CpContext cpcontext)
    Number primarykey = 0;
    String s = cpcontext.getProfileStore().getProfile("APPS_FRAMEWORK_AGENT");
    if (!s.endsWith("/"))
    s = (new StringBuilder()).append(s).append("/").toString();
    s = (new StringBuilder()).append(s).append("OA_MEDIA").toString();
    Connection con = cpcontext.getJDBCConnection();
    ReqCompletion lRC = cpcontext.getReqCompletion();
    try
    // Connection con = DriverManager.getConnection("jdbc:oracle:thin:@erpdb.nalsoft.net:1521:VIS","apps","visdb_234");
    PreparedStatement st =
    con.prepareStatement("select file_id from xxcrm_file_upload order by file_id desc");
    ResultSet rs = st.executeQuery();
    OutFile lOF = cpcontext.getOutFile();
    LogFile lLF = cpcontext.getLogFile();
    if (rs.next())
    primarykey = rs.getBigDecimal(1);
    BigDecimal primarybig = new BigDecimal(primarykey.toString());
    s = getOutput(con, primarybig);
    lOF.write(s);
    lRC.setCompletion(ReqCompletion.NORMAL, "Request Completed Normal");
    } catch (Exception e)
    { //Catch exception if any
    System.err.println("Error: " + e.getMessage());
    lRC.setCompletion(ReqCompletion.ERROR, e.toString());
    } finally
    cpcontext.releaseJDBCConnection();
    Please give me your suggestions.

  • Verify Data Reader Account

    Is there a way to verify which account SCOM thinks is the data reader account?  I'm having an issue installing SCOM Reporting as it's telling me the account I'm providing during the install is not the same account that's specified in the management
    group.  Could this be verified somewhere in a SQL table?

    Hi there,
    The Data Warehouse Action Account is the data writer account.
    The Data Warehouse Report Deployment Account is the data reader account.
    So I would suggest you double-click the Data Warehouse Report Deployment Account profile and see which account is present there.
    Then go to SSMS and verify that the required permissions are granted; refer to the following link for the permissions, with screenshots:
    https://social.technet.microsoft.com/Forums/en-US/4b58dcb1-7138-4bb1-a26c-9f5356113b11/scom-2012-r2-availability-report-errors?forum=operationsmanagerdeployment
    Gautam.75801

Maybe you are looking for

  • Database startup after reboot of RAC server

    Hello. My config: 2 nodes on W2K3 with 15 DBs, Oracle 10.2.0.3. I started to schedule reboots of my RAC servers, so I followed the Oracle documentation to properly shut down all DBs, ASM, services, listeners, etc. Today one of the servers rebooted, but some instances don't start

  • New 160 GB iPod messes up videos!

    OK. I recently had to buy a new computer because my old one was well...old. I updated from Compaq to HP which means I went from Windows XP to Windows 7. I still have a 30 GB iPod which plays my videos wonderfully. No problems at all. Then I bought a

  • How to find average item price

    How can I find the average item price of an item through a query or the application?

  • Is iWeb Good for Blogging, compared to free software?

    I'm trying to decide if it's worthwhile to continue using iWeb for my blog. It's a little cumbersome, it's hard to set up sitemaps and increase traffic, and it doesn't always work very well. Any honest opinions comparing it to Blogger, WordPress, and

  • Lightroom 3 and Photoshop Elements 8

    Hello, I have just installed Lightroom 3 on my computer. I already had Photoshop Elements 8 installed. When I want to transfer photos from my Canon T1i camera to Lightroom 3, the "Organizer downloader" window