Help needed setting a specific date in a GregorianCalendar object?

Hi all!
I have set a SimpleTimeZone object on a GregorianCalendar object. How can I set a new date on the GregorianCalendar object, so that I can get the DAY_OF_YEAR value for that specific date?
Urgent replies will be appreciated!

Thanks for your reply!
But the problem is that I have to create a method:
public int getDayOfYear(int day, int month, int year)
return day_of_the_year;
This method should accept a user-specified day, month and year and pass them to a GregorianCalendar/Calendar object. I want to read the resulting DAY_OF_YEAR value back via Calendar.DAY_OF_YEAR, but the change is not taking effect!
ACTUAL CODE
===========
public static double getDayofYear(int hr, int min, int sec) {
    String[] ids = TimeZone.getAvailableIDs(5 * 60 * 60 * 1000);
    SimpleTimeZone pdt = new SimpleTimeZone(5 * 60 * 60 * 1000, ids[0]);
    for (int i = 0; i < ids.length; i++)
        System.out.println(ids[i]);
    // setting DST start and end rules
    pdt.setStartRule(Calendar.MAY, 1, Calendar.SUNDAY, 00 * 60 * 60 * 1000);
    pdt.setEndRule(Calendar.OCTOBER, -1, Calendar.SUNDAY, 00 * 60 * 60 * 1000);
    Calendar calendar = new GregorianCalendar(pdt);
    //Date trialTime = new Date(); // SET DATE
    //calendar.setTime(trialTime);
    calendar.set(Calendar.HOUR, hr);
    calendar.set(Calendar.MINUTE, min);
    calendar.set(Calendar.SECOND, sec);
    return (double) calendar.get(Calendar.DAY_OF_YEAR);
}
This method gives the correct output for today [2nd Dec. 2002 -> 336], but only because it always uses the current date: it only accepts a time, not a date, so changing the arguments does not change the output!
Hoping for your help!
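For reference, a minimal sketch of the method described above, assuming the same GMT+5 offset as the posted code. The key point is that the day, month and year must be set on the calendar explicitly; only then does Calendar.DAY_OF_YEAR reflect the requested date rather than today's. The class name is illustrative.

```java
import java.util.Calendar;
import java.util.GregorianCalendar;
import java.util.SimpleTimeZone;

public class DayOfYearExample {

    // Returns the day-of-year (1..366) for the given date.
    // 'month' uses the Calendar constants (Calendar.JANUARY == 0).
    public static int getDayOfYear(int day, int month, int year) {
        SimpleTimeZone tz = new SimpleTimeZone(5 * 60 * 60 * 1000, "GMT+5");
        Calendar calendar = new GregorianCalendar(tz);
        calendar.clear();               // drop the current date/time
        calendar.set(year, month, day); // set the requested date
        return calendar.get(Calendar.DAY_OF_YEAR);
    }

    public static void main(String[] args) {
        // 2 Dec 2002 is day 336, matching the value reported in the post
        System.out.println(getDayOfYear(2, Calendar.DECEMBER, 2002));
    }
}
```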

Similar Messages

  • Needed SAP CRM Data model with Object, Entity and Attribute level details

    Hello all,
We are working on a huge IS-U / CRM implementation and we are still in the data-gathering phase. The client has a whole load of legacy systems that will be replaced with IS-U and CRM. Right now we are developing data models in Excel first and then presenting them to the client to go forward from there. For this we need all the business objects, entities and their attributes.
    I know about the SD11 transaction, but we don't have a CRM system yet. My colleagues have access to a German ERP system and they were able to get models for HR, FI and Asset management. I tried for the Business partner / customer in there, but the models were not proper.
So, once again, I need the specific data models out of SD11 for the CRM business partner. If anybody has the information, please pass it on to me, as I need it urgently. It would be a great help if somebody could do so.
    Regards
    Rajesh

    I suggest the following:
Please check whether the system works if you activate the implementation BUPA_F4_AUGRP.
In addition, check notes 559662, 674869 and 782927. If the notes are already implemented, you can still try implementing the BAdI (transaction SE19). It should resolve your issue.
I have implemented this BAdI solution before; after activation, neither the search help nor the search result list showed any business partners that had an authorization group I was not allowed to see.
    kind regards
    Davy Pelssers
    SAP CRM/Security consultant

  • Need Help to Transfer Specific Data with dml_condition

I have two databases on two different servers, named db1 and db2. I want to transfer specific data with a DML condition. Below is my code.
For the server that has db1:
    --------------Sys----------------------
    create user strmadmin identified by strmadmin;
    grant connect, resource, dba to strmadmin;
    begin dbms_streams_auth.grant_admin_privilege
    (grantee => 'strmadmin',
    grant_privileges => true);
    end;
    grant select_catalog_role, select any dictionary to strmadmin;
    alter system set global_names=true;
    alter system set streams_pool_size = 100 m;
    ----------------------end--------------------
    -----------------------StrmAdmin--------------------------
    create database link db2
    connect to strmadmin
    identified by strmadmin
    using 'DB2';
    EXEC DBMS_STREAMS_ADM.SET_UP_QUEUE();
EXEC DBMS_CAPTURE_ADM.PREPARE_TABLE_INSTANTIATION(table_name => 'scott.emp');
    -- Configure capture process at the source database
    begin dbms_streams_adm.add_table_rules
    ( table_name => 'scott.emp',
    streams_type => 'capture',
    streams_name => 'capture_stream',
    queue_name=> 'strmadmin.streams_queue',
    include_dml => true,
    include_ddl => true,
    inclusion_rule => true);
    end;
    -- -- Configure Sub Set Rules capture process at the source database
    begin dbms_streams_adm.add_subset_rules
    ( table_name => 'scott.emp',
    dml_condition=>'deptno=50',
    streams_type => 'capture',
    streams_name => 'capture_stream',
    queue_name=> 'strmadmin.streams_queue',
    include_tagged_lcr => true);
    end;
    --     Configure the propagation process at Sources Database
    begin dbms_streams_adm.add_table_propagation_rules
    ( table_name => 'scott.emp',
    streams_name => 'DB1_TO_DB2',
    source_queue_name => 'strmadmin.streams_queue',
    destination_queue_name => 'strmadmin.streams_queue@DB2',
    include_dml => true,
    include_ddl => true,
    source_database => 'DB1',
    inclusion_rule => true);
    end;
    --     Configure the Subset propagation Rule process at Sources Database
    begin SYS.dbms_streams_adm.add_subset_propagation_rules
    ( table_name => 'scott.emp',
    dml_condition=>'deptno=50',
    streams_name => 'DB1_TO_DB2',
    source_queue_name => 'strmadmin.streams_queue',
    destination_queue_name => 'strmadmin.streams_queue@DB2',
    include_tagged_lcr => true);
    end;
    --      Set the instantiation system change number (SCN)
    declare
    source_scn number;
    begin
    source_scn := dbms_flashback.get_system_change_number();
    dbms_apply_adm.set_table_instantiation_scn@DB2
    ( source_object_name => 'scott.emp',
    source_database_name => 'DB1',
    instantiation_scn => source_scn);
    end;
    --      Start the capture processes
    begin dbms_capture_adm.start_capture
    ( capture_name => 'capture_stream');
    end;
    ---------------------------End----------------------------------------------------------
For server 2, which has db2:
    --------------------------Sys---------------------------------------------------------------
    CREATE USER strmadmin IDENTIFIED BY strmadmin;
    GRANT CONNECT, RESOURCE, DBA TO strmadmin;
    BEGIN
    DBMS_STREAMS_AUTH.grant_admin_privilege (grantee => 'strmadmin',
    grant_privileges => TRUE);
    END;
    GRANT SELECT_CATALOG_ROLE, SELECT ANY DICTIONARY TO strmadmin;
    ALTER SYSTEM SET global_names=TRUE;
    ALTER SYSTEM SET streams_pool_size = 100 M;
    -----------------------------------------------------------End-----------------------------
    ---------------------------------Stream user--------------------------------------------------------------
    CREATE DATABASE LINK db1
    CONNECT TO strmadmin
    IDENTIFIED BY strmadmin
    USING 'DB1';
    EXEC DBMS_STREAMS_ADM.SET_UP_QUEUE();
EXEC DBMS_CAPTURE_ADM.PREPARE_TABLE_INSTANTIATION(table_name => 'scott.emp');
    -- add table Level rule on target Database.
    BEGIN
    DBMS_STREAMS_ADM.add_table_rules (
    table_name => 'scott.emp',
    streams_type => 'apply',
    streams_name => 'apply_stream',
    queue_name => 'strmadmin.streams_queue',
    include_dml => TRUE,
    include_ddl => TRUE,
    source_database => 'DB1',
    inclusion_rule => TRUE);
    END;
    -- add table Level Sub Set rule on target Database.
    BEGIN
    DBMS_STREAMS_ADM.add_subset_rules (
    table_name => 'scott.emp',
    dml_condition => 'deptno=50',
    streams_type => 'apply',
    streams_name => 'apply_stream',
    queue_name => 'strmadmin.streams_queue',
    include_tagged_lcr => TRUE);
    END;
    -- Start the apply processes
    BEGIN
    DBMS_APPLY_ADM.set_parameter (apply_name => 'apply_stream',
    parameter => 'disable_on_error',
    VALUE => 'n');
    END;
    BEGIN
    DBMS_APPLY_ADM.start_apply (apply_name => 'apply_stream');
    END;
    ---------------------------------End---------------------------------------------------------------------------------
Please help me.

Below is the result:
    RULE_NAME RULE_TYPE RULE_SET_TYPE RULE_SET_NAME STREAMS_TYPE STREAMS_NAME
    RULE$_7 POSITIVE RULESET$_8 DEQUEUE SCHEDULER_PICKUP
    RULE$_11 POSITIVE RULESET$_8 DEQUEUE SCHEDULER_PICKUP
    RULE$_3 POSITIVE RULESET$_4 DEQUEUE SCHEDULER_COORDINATOR
    EMP122 DDL POSITIVE RULESET$_123 PROPAGATION DB1_TO_DB2
    EMP121 DML POSITIVE RULESET$_123 PROPAGATION DB1_TO_DB2
    EMP124 DML POSITIVE RULESET$_123 PROPAGATION DB1_TO_DB2
    EMP125 DML POSITIVE RULESET$_123 PROPAGATION DB1_TO_DB2
    EMP126 DML POSITIVE RULESET$_123 PROPAGATION DB1_TO_DB2
    EMP115 DML POSITIVE RULESET$_117 CAPTURE CAPTURE_STREAM
    EMP116 DDL POSITIVE RULESET$_117 CAPTURE CAPTURE_STREAM
    EMP118 DML POSITIVE RULESET$_117 CAPTURE CAPTURE_STREAM
    EMP119 DML POSITIVE RULESET$_117 CAPTURE CAPTURE_STREAM
    EMP120 DML POSITIVE RULESET$_117 CAPTURE CAPTURE_STREAM
    Edited by: Naeem Ullah Khattak on Apr 19, 2013 2:57 AM

  • Help needed on meta data export

I need a metadata export dump of two schemas from a source database, to import into a fresh new database.
Could you please help with the steps to be followed?

Assuming you are using 10g and the exp utility, you can specify rows=N to export metadata only.
exp help=yes
    Export: Release 10.2.0.1.0 - Production on Thu Jul 23 13:36:59 2009
    Copyright (c) 1982, 2005, Oracle. All rights reserved.
    You can let Export prompt you for parameters by entering the EXP
    command followed by your username/password:
    Example: EXP SCOTT/TIGER
    Or, you can control how Export runs by entering the EXP command followed
    by various arguments. To specify parameters, you use keywords:
    Format: EXP KEYWORD=value or KEYWORD=(value1,value2,...,valueN)
    Example: EXP SCOTT/TIGER GRANTS=Y TABLES=(EMP,DEPT,MGR)
    or TABLES=(T1:P1,T1:P2), if T1 is partitioned table
    USERID must be the first parameter on the command line.
    Keyword Description (Default) Keyword Description (Default)
    USERID username/password FULL export entire file (N)
    BUFFER size of data buffer OWNER list of owner usernames
    FILE output files (EXPDAT.DMP) TABLES list of table names
    COMPRESS import into one extent (Y) RECORDLENGTH length of IO record
    GRANTS export grants (Y) INCTYPE incremental export type
    INDEXES export indexes (Y) RECORD track incr. export (Y)
    DIRECT direct path (N) TRIGGERS export triggers (Y)
    LOG log file of screen output STATISTICS analyze objects (ESTIMATE)
ROWS export data rows (Y) PARFILE parameter filename
    CONSISTENT cross-table consistency(N) CONSTRAINTS export constraints (Y)
    OBJECT_CONSISTENT transaction set to read only during object export (N)
    FEEDBACK display progress every x rows (0)
    FILESIZE maximum size of each dump file
    FLASHBACK_SCN SCN used to set session snapshot back to
    FLASHBACK_TIME time used to get the SCN closest to the specified time
    QUERY select clause used to export a subset of a table
    RESUMABLE suspend when a space related error is encountered(N)
    RESUMABLE_NAME text string used to identify resumable statement
    RESUMABLE_TIMEOUT wait time for RESUMABLE
    TTS_FULL_CHECK perform full or partial dependency check for TTS
    TABLESPACES list of tablespaces to export
    TRANSPORT_TABLESPACE export transportable tablespace metadata (N)
    TEMPLATE template name which invokes iAS mode export
    Export terminated successfully without warnings.

  • Help needed setting up an accelerometer via serial port

Hoping you might be able to give me some advice and point me in the right direction, as my experience with LabVIEW is limited.
I want to read data from an accelerometer streaming through a serial port. Using the LabVIEW examples I have no problems connecting to the device and seeing the data, but I am going around in circles trying to work out where to go from here.
I need to create a VI which shows the data streaming and then gives the user the ability to save data for various time periods to a text file. I am getting stuck because the data from the accelerometer is constantly streaming in 20-byte packets every 2 ms in the following format: 7E 00 0E 83 00 00 26 00 01 0E 00 02 00 01 F9 02 64 E5, where 00 02 01 F9 02 64 is the data of interest; and of course, when starting to read from the device, a read isn't necessarily going to begin at the start of a packet.
I am assuming that inside the while loop I need the VI to scan the string to find the start of the first packet and then begin writing it to an array, using a shift register for each sample, with any leftover from the string (if not reading a full packet) being attached to the next read. Then, if the data is to be saved to a txt file, it enters a for loop where the user defines the number of samples to be saved, or an Index Array option is used to save a subset of data to a txt file.
Would greatly appreciate any advice or examples of the most efficient way to set this up before I lose my sanity!

    The easy way is a three step process.
    First, acquire 2n-1 bytes (39 in your case) so you are sure you have a full set of data.  Parse out the first full set using the string utilities (match pattern, split string, etc.).  The remainder of the string is the start of your next data set.
    Second, acquire enough points to get the next data set and parse it.
    Third, continue looping and getting 20 bytes (keep verifying your start and end so you know if there are glitches).
    A harder, but far more robust, way, is the following:
    Read in data until you get your start sequence.  A circular buffer would be useful for this (you will need to implement this yourself).
    Read in your data of interest and record/save/...
    Read in the terminator.
    Repeat as needed.
    For all of these, you can use a flat sequence in a loop, but a state machine is far easier and more flexible in the long run.  Check out the LabVIEW help and these forums for how to create and use a state machine.
    It is often easier to use arrays of U8s instead of strings for this type of data manipulation.  Under the hood, they are the same thing, so the conversions between the two are highly efficient.
    Let us know about any problems you run into.
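The resynchronization step in the first approach (scan for the packet start, take a fixed-length packet, carry the remainder into the next read) can be sketched outside LabVIEW too. Here is a minimal Java version; the 20-byte length and the 7E start byte come from the sample packet in the question, everything else is illustrative:

```java
import java.util.Arrays;

public class PacketSync {
    static final int PACKET_LEN = 20;  // fixed packet size from the question
    static final byte START = 0x7E;    // assumed start-of-packet marker

    // Scans 'buffer' for the first complete packet beginning with 0x7E.
    // Returns the 20-byte packet, or null if no complete packet is present yet.
    // In a real read loop, bytes after the returned packet would be kept and
    // prepended to the next serial read.
    public static byte[] extractPacket(byte[] buffer) {
        for (int i = 0; i + PACKET_LEN <= buffer.length; i++) {
            if (buffer[i] == START) {
                return Arrays.copyOfRange(buffer, i, i + PACKET_LEN);
            }
        }
        return null; // not enough data yet; wait for the next read
    }
}
```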
    This account is no longer active. Contact ShadesOfGray for current posts and information.

  • Help needed in storing data in DSO or InfoCube

    Hi Experts ,
       I have a set of data which comes from R/3 in the following Format:
    FG     Base Model     Option 1     Option 2     Option 3
    FG1     BM 1                       Opt1       Opt2         opt3
    FG2     BM1                       Opt 1       Opt2        Opt 4
    FG3     BM3                       Opt5       Opt6        Opt7
    The data gets stored in the PSA in the above format.
But I want to load the data from the PSA to the DSO or InfoCube in the following format. If I display the data in the DSO or the InfoCube, the output should look like this:
    FG     Base Model     Option
    FG1     BM1                      Opt1
    FG1     BM1                      Opt2
    FG1     BM1                      Opt3
    FG2     BM1                      Opt1
    FG2     BM1                      Opt2
    FG2     BM1                      Opt4
    FG3     BM3                      Opt5
    FG3     BM3                      Opt6
    FG3     BM3                      Opt7
Is there any way to do this? Please help. Thanks in advance...
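The reshaping described above, one wide record per FG split into one row per option, can be illustrated with a small sketch. In BW this is done declaratively by a transformation rule group rather than in code; the class and method names here are purely illustrative:

```java
import java.util.ArrayList;
import java.util.List;

public class FlattenOptions {
    // Splits one wide record {FG, BaseModel, opt1, opt2, ...}
    // into one narrow row {FG, BaseModel, option} per non-empty option.
    public static List<String[]> flatten(String fg, String baseModel, String... options) {
        List<String[]> rows = new ArrayList<>();
        for (String opt : options) {
            if (opt != null && !opt.isEmpty()) {
                rows.add(new String[] { fg, baseModel, opt });
            }
        }
        return rows;
    }
}
```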

    Hi Samir,
Use a transformation rule group in the transformations.
Here is one of the examples, done the way you wanted:
    http://help.sap.com/saphelp_nw70/helpdata/EN/44/32cfcc613a4965e10000000a11466f/content.htm
    Thanks
    Ajeet

  • Help needed with passing data between classes, graph building application?

Good afternoon. Please could you help me with a problem with my application? My head is starting to hurt.
I have run into some difficulties when trying to build an application that generates a line graph.
Firstly, I have a GUI into which the client enters data in a text area called jta; this data is tokenised, placed into a format the application can use, and passed to a separate class that draws the graph.
I think the problem lies with the way I am trying to put the data into co-ordinate form (x, y), as no line is being generated.
    The following code is from the GUI:
public void actionPerformed(ActionEvent e) {
    // Takes data and provides program with CoOrdinates
    int[][] data = createData();
    // Set the data to graph for display
    grph.showGrph(data);
    // Show the frame
    grphFrame.setVisible(true);
}

/** set the data given to the application */
private int[][] createData() {
    String rawData = jta.getText();
    StringTokenizer tokens = new StringTokenizer(rawData);
    List list = new LinkedList();
    while (tokens.hasMoreElements()) {
        String number = "";
        String token = tokens.nextToken();
        for (int i = 0; i < token.length(); i++) {
            if (Character.isDigit(token.charAt(i))) {
                number += token.substring(i, i + 1);
            }
        }
    }
    int[][] data = new int[list.size() / 2][2];
    int index = -2;
    for (int i = 0; i < data.length; i++) {
        index += 2;
        data[i][0] = Integer.parseInt(list.get(index).toString());
        data[i][1] = Integer.parseInt(list.get(index + 1).toString());
    }
    return data;
}
The following is the code for drawing the graph:
public void showGrph(int[][] data) {
    this.data = data;
    repaint();
}

/** Paint the graph */
protected void paintComponent(Graphics g) {
    //if (data == null)
    //    return; // No display if data is null
    super.paintComponent(g);
    // x is the start position for the first point
    int x = 30;
    int y = 30;
    for (int i = 0; i < data.length - 1; i++) {
        g.drawLine(data[i][0], data[i][1], data[i + 1][0], data[i + 1][1]);
    }
}
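For what it's worth, the tokenizer loop in createData() builds number but never adds it to the list, so list.size() is always 0 and no coordinates are ever produced. A minimal corrected sketch of just the parsing step, assuming whitespace-separated x y values (class name illustrative):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.StringTokenizer;

public class CoordinateParser {
    // Parses whitespace-separated numbers into (x, y) pairs,
    // ignoring any non-digit characters inside each token.
    public static int[][] createData(String rawData) {
        StringTokenizer tokens = new StringTokenizer(rawData);
        List<Integer> list = new ArrayList<>();
        while (tokens.hasMoreTokens()) {
            StringBuilder number = new StringBuilder();
            String token = tokens.nextToken();
            for (int i = 0; i < token.length(); i++) {
                if (Character.isDigit(token.charAt(i))) {
                    number.append(token.charAt(i));
                }
            }
            if (number.length() > 0) {
                list.add(Integer.parseInt(number.toString())); // the missing step
            }
        }
        int[][] data = new int[list.size() / 2][2];
        for (int i = 0; i < data.length; i++) {
            data[i][0] = list.get(2 * i);
            data[i][1] = list.get(2 * i + 1);
        }
        return data;
    }
}
```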

    Thanks for that tip!
    package LineGraph;
    import java.awt.*;
    import java.awt.event.*;
    import javax.swing.*;
    import java.util.*;
    import java.util.List;
    public class GUI extends JFrame
      implements ActionListener {
      private JTextArea Filejta;
      private JTextArea jta;
      private JButton jbtShowGrph = new JButton("Show Chromatogram");
      public JButton jbtExit = new JButton("Exit");
      public JButton jbtGetFile = new JButton("Search File");
      private Grph grph = new Grph();
      private JFrame grphFrame = new JFrame();   // Create a new frame to hold the Graph panel
      public GUI() {
         JScrollPane pane = new JScrollPane(Filejta = new JTextArea("Default file location: - "));
         pane.setPreferredSize(new Dimension(350, 20));
         Filejta.setWrapStyleWord(true);
         Filejta.setLineWrap(true);     
        // Store text area in a scroll pane 
        JScrollPane scrollPane = new JScrollPane(jta = new JTextArea("\n\n Type in file location and name and press 'Search File' button: - "
                  + "\n\n\n Data contained in the file will be diplayed in this Scrollpane "));
        scrollPane.setPreferredSize(new Dimension(425, 300));
        jta.setWrapStyleWord(true);
        jta.setLineWrap(true);
        // Place scroll pane and button in the frame
        JPanel jpButtons = new JPanel();
        jpButtons.setLayout(new FlowLayout());
        jpButtons.add(jbtShowGrph);
        jpButtons.add(jbtExit);
        JPanel searchFile = new JPanel();
        searchFile.setLayout(new FlowLayout());
        searchFile.add(pane);
        searchFile.add(jbtGetFile);
        add (searchFile, BorderLayout.NORTH);
        add(scrollPane, BorderLayout.CENTER);
        add(jpButtons, BorderLayout.SOUTH);
        // Exit Program
        jbtExit.addActionListener(new ActionListener(){
        public void actionPerformed(ActionEvent e) {
             System.exit(0);
        // Read Files data contents
         jbtGetFile.addActionListener(new ActionListener(){
         public void actionPerformed( ActionEvent e) {
                   String FileLoc = Filejta.getText();
                   LocateFile clientsFile;
                   clientsFile = new LocateFile(FileLoc);
                        if (FileLoc != null){
                             String filePath = clientsFile.getFilePath();
                             String filename = clientsFile.getFilename();
                             String DocumentType = clientsFile.getDocumentType();
         public String getFilecontents(){
              String fileString = "\t\tThe file contains the following data:";
         return fileString;
           // Register listener     // Create a new frame to hold the Graph panel
        jbtShowGrph.addActionListener(this);
        grphFrame.add(grph);
        grphFrame.pack();
        grphFrame.setTitle("Chromatogram showing data contained in file \\filename");
      /** Handle the button action */
      public void actionPerformed(ActionEvent e) {
        // Takes data and provides program with CoOrdinates
        int[][]data = createData();
        // Set the data data to graph for display
        grph.showGrph(data);
        // Show the frame
        grphFrame.setVisible(true);
      /** set the data given to the application */
      private int[][] createData() {
           String rawData = jta.getText();
           StringTokenizer tokens = new StringTokenizer(rawData);
           List list = new LinkedList();
           while (tokens.hasMoreElements()){
                String number = "";
                String token = tokens.nextToken();
                for (int i=0; i<token.length(); i++){
                     if (Character.isDigit(token.charAt(i))){
                          number += token.substring(i, i+1);
           int [][]data = new int[list.size()/2][2];
           int index = -2;
           for (int i=0; i<data.length;i++){
                     index += 2;
data[i][0] = Integer.parseInt(
        (list.get(index).toString()));
data[i][1] = Integer.parseInt(
        (list.get(index + 1).toString()));
         return data;
    public static void main(String[] args) {
    GUI frame = new GUI();
    frame.setLocationRelativeTo(null); // Center the frame
    frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
    frame.setTitle("Clients Data Retrival GUI");
    frame.pack();
    frame.setVisible(true);
    package LineGraph;
    import javax.swing.*;
    import java.awt.*;
    public class Grph extends JPanel {
         private int[][] data;
    /** Set the data and display Graph */
    public void showGrph(int[][] data) {
    this.data = data;
    repaint();
    /** Paint the graph */
    protected void paintComponent(Graphics g) {
    //if (data == null)
         //return; // No display if data is null
    super.paintComponent(g);
    //Find the panel size and bar width and interval dynamically
    int width = getWidth();
    int height = getHeight();
    //int intervalw = (width - 40) / data.length;
    //int intervalh = (height - 20) / data.length;
    //int individualWidth = (int)(((width - 40) / 24) * 0.60);
    ////int individualHeight = (int)(((height - 40) / 24) * 0.60);
    // Find the maximum data. The maximum data
    //int maxdata = 0;
    //for (int i = 0; i < data.length; i++) {
    //if (maxdata < data[i][0])
    //maxdata = data[i][1];
    // x is the start position for the first point
    int x = 30;
    int y = 30;
    //draw a vertical axis
    g.drawLine(20, height - 45, 20, (height)* -1);
    // Draw a horizontal base line4
    g.drawLine(20, height - 45, width - 20, height - 45);
for (int i = 0; i < data.length - 1; i++) {
    //int Value = i;      
    // Display a line
    //g.drawLine(x, height - 45 - Value, individualWidth, height - 45);
    g.drawLine(data[i][0],data[i][1],data[i+1][0],data[i+1][1]);
    // Display a number under the x axis
    g.drawString((int)(0 + i) + "", (x), height - 30);
    // Display a number beside the y axis
    g.drawString((int)(0 + i) + "", width - 1277, (y) + 900);
    // Move x for displaying the next character
    //x += (intervalw);
    //y -= (intervalh);
    /** Override getPreferredSize */
    public Dimension getPreferredSize() {
    return new Dimension(1200, 900);

  • Help needed setting up print server

I've got 2 printers:
1 USB inkjet Epson 1290
1 HP 5100 network printer
I have 1 server:
serving the Epson for our colour proofs, using PowerRIP X as the driver, shared via the OS X print server.
I have 2 network interface cards in the server.
My problem is: the server's position in the office has only one network point. I want the HP connected to the 2nd network interface of the server, the server connected to the only available network point, and thus the server serving both printers.
I have set up the en1 interface and enabled NAT, DHCP and the firewall, but the server itself does not recognise the HP connected to the 2nd Ethernet interface. When I move it to the top of the Ethernet list in System Preferences it does recognise it, but then the whole print server is not accessible on the network.
Can someone with more experience help me set this up?
Thanks

I didn't say it's impossible. I just offered an easy solution to your problem that will save you and your users some hassle in the long run (e.g. when you come to troubleshoot you won't have to look at 2 networks and a firewall). I guess I just like easy solutions. And, dude, 20 bucks...
    If you want to go the hard(er) way you need to:
    1. set up a second network address range on the second card
    2. give the printer a fixed IP address within that range
    3. ensure firewall is not blocking packets from/to the printer
    4. config the printer on your print server & clients
    hth,
    b.

  • Help needed to insert data from different database

Hi,
I have a requirement where I need to fetch data from a different database through a database link. Depending on the user request, the db link needs to change, and data from the respective table in the specified database has to be fetched and populated. Could I use EXECUTE IMMEDIATE for this? Would a db link work within EXECUTE IMMEDIATE? If not, could you please let me know any other approach?

    What does "the dblink needs to change" mean?
    Are you trying to dynamically create database links at run-time? Or to point a query at one of a set of pre-established database links at run-time?
Are you sure that you really need to get the data from the remote database in real time? Could you use materialized views, Streams, etc. to move the data from the remote databases to the local database? That tends to be far more robust.
    Justin

  • Help Needed Setting Up IMAP Relay?

I am currently maintaining an OS X 10.4.11 Tiger server, running on a PowerMac G4 Mirror.
Currently, we have our mail set up through our ISP via POP (they do not offer IMAP). All of our employees have set up several access points to the POP server (i.e. home, office, smart phone, etc.). However, the problem arises that, as each device maintains its own mail, mail read on one device still shows as unread on the others, and any mail sent from one device does not exist on the others.
To eliminate the duplication and missing mail, I thought setting up an IMAP server would keep all devices in sync.
My thought is: collect all the mail from the ISP's POP server onto the OS X server, and then have the employees access only the IMAP server from all devices.
    i.e.
Set up the IMAP server.
Set up the IMAP server to download mail from the POP server and save it in the appropriate account.
Set up all client mail accounts to access the IMAP server.
I have been able to set up the IMAP server without any issues; however, I have not been able to figure out how to get the POP mail from the ISP into the local IMAP database.
I hope this makes sense.
I know there are free services out there that will do what we want, but we do not want to rely on an external service if it is a process we can maintain internally.
If anyone can help me set this up, or has suggestions on how to approach this problem, I would be very grateful.
    TIA

    I'd probably have a conversation with the mail ISP along the lines of "add IMAP, please" and that will either be followed by happiness, or by migrating to another mail vendor, or migrating to local mail services.
I've bucketed mail over (via IMAP), and it's not really a good solution; there are a number of wheels that need to be kept spinning, and by the time the dust settles (outages, glitches, the usual cruft), you might as well either re-host mail to another mail ISP that offers IMAP (Gmail, for instance), or simply run your own mail server.
    Usual tools are imapsync and offlineimap, and those target (obviously) imap and not pop. (I haven't seen much that does a pop pull, but it's likely feasible.)
    If you decide to host your own, you'll probably want to upgrade to something newer than the PPC gear, and you'll need to sort out your public forward and reverse DNS for your mail server, and ensure all that matches your MX record.

  • Help Needed : Setting JDBC Datasource in Crystal XI using Sybase 12.5??

    All,
We are migrating our systems from Crystal 9 to Crystal XI Reporting Servers. I need some technical assistance with the Java API (use of data sources).
Problem: currently, in Crystal 9, we use OLEDB data sources to connect to various Sybase 12.5 databases (set programmatically at runtime). After I changed the Java code to get the report object from the Crystal XI server instead of 9, I started facing dbLogon failed exceptions. But, surprisingly, it does work randomly on only one data source for a given session (app server restart).
Need help on these topics:
1. Is there a different way of configuring/using OLEDB data sources for CR XI, or are any Java API changes needed when setting the data source via OLEDB?
2. What configuration is required for using a plain JDBC connection (Sybase) instead of OLEDB? Can I get detailed help on this?
I would prefer using JDBC for CR XI with Sybase 12.5, but there is very little documentation on how to configure and set databases at runtime.
    Many Thanks in Adv.
    Regards - Sudhir Deshmukh | Solutions Lead

    Hi Sudhir,
For building up an OLE DB connection, the following requirements must be met in order for Crystal Reports to connect to a database through OLE DB:
• The database client software must be installed on the client machine.
• The client machine must be able to connect to the server from its client software.
• The client software's working directory (e.g. c:\orant\bin) must be in the Windows search path.
• The OLE DB provider must be installed on the local machine.
    We have the connection for crystal reports through OLEDB :
    The process by which Crystal Reports access data from an OLE DB data source consists of these five layers:
    Crystal Reports Layer
    OLE DB Translation Layer
    OLE DB Layer
    DBMS Translation (OLE DB provider) Layer
    Database Layer
    The data translation is similar to the ODBC connection model. Crystal Reports uses CRDB_ADO.DLL to communicate to the OLE DB provider, which communicates to the database. Crystal Reports can connect to any database as long as that database has an OLE DB provider.
    When creating a new report in Crystal Reports, OLE DB data sources are found in the Create New Connection folder with the OLE DB (ADO) connection.
    Regards,
    Naveen.
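    On the JDBC route asked about in point 2, a minimal sketch of a plain java.sql connection to Sybase ASE 12.5 via jConnect; the host, port, database name and credentials are placeholders (assumptions, not values from this thread), and jconn2.jar/jconn3.jar must be on the classpath for the commented-out connection step:

```java
// Sketch: building the jConnect JDBC URL for Sybase ASE and (optionally) connecting.
// All connection details below are hypothetical placeholders.
public class SybaseJdbcSketch {

    // jConnect URL format: jdbc:sybase:Tds:<host>:<port>/<database>
    static String buildUrl(String host, int port, String database) {
        return "jdbc:sybase:Tds:" + host + ":" + port + "/" + database;
    }

    public static void main(String[] args) throws Exception {
        String url = buildUrl("dbhost", 5000, "reports");
        System.out.println(url); // jdbc:sybase:Tds:dbhost:5000/reports

        // With the jConnect driver on the classpath, the actual connection
        // would be obtained like this (commented out here, since it needs
        // a live server and the Sybase driver jar):
        // Class.forName("com.sybase.jdbc3.jdbc.SybDriver");
        // java.sql.Connection con =
        //     java.sql.DriverManager.getConnection(url, "user", "password");
    }
}
```

    Switching databases at runtime then amounts to building a different URL before opening the connection; how to attach that connection to a report via the Crystal XI Java SDK is not covered by this sketch.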

  • Pros help needed with post data code

    I need to change the code below to post the userAnswer from the radio button group to an ASPX page, so I can read the data in and post it to a database. Can anyone help? Thanks.
    btnCheck.addEventListener(MouseEvent.CLICK, checkAnswer);
    btnSubmit.addEventListener(MouseEvent.CLICK, submitHandler);

    function checkAnswer(evt:MouseEvent):void {
        userAnswer = String(rbg.selectedData);
        messageBox.text = userAnswer + " has been clicked";
    }

    //Create the loader object
    var prodloader:URLLoader = new URLLoader();
    prodloader.addEventListener(Event.COMPLETE, contentLoaded);
    //Create a URLVariables object to store the details
    var variables:URLVariables = new URLVariables();
    //Create the URLRequest object to specify the file to be loaded and the method, i.e. POST
    var url:String = "url here";
    var prodreq:URLRequest = new URLRequest(url);
    prodreq.method = URLRequestMethod.POST;
    prodreq.data = variables;

    function submitHandler(event:MouseEvent):void {
        variables.userAnswer = userAnswer; //the value the ASPX page will read
        prodloader.load(prodreq);
    }

    function contentLoaded(event:Event):void {
        //Stuff here
    }
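    For reference, URLVariables serializes its properties as a standard application/x-www-form-urlencoded POST body. A small sketch of that encoding (written in Java, since the thread's server language isn't shown here; the parameter name is an assumption):

```java
import java.net.URLEncoder;

// Illustrates the name=value form encoding that AS3's URLVariables
// produces for a POST body; "userAnswer" is a hypothetical field name.
public class PostBodySketch {

    static String encodePair(String name, String value) throws Exception {
        return URLEncoder.encode(name, "UTF-8") + "=" + URLEncoder.encode(value, "UTF-8");
    }

    public static void main(String[] args) throws Exception {
        String body = encodePair("userAnswer", "option b");
        System.out.println(body); // userAnswer=option+b
    }
}
```

    On the ASPX side such a field would typically be read as Request.Form["userAnswer"] before being written to the database.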

  • Help Needed in persisting data in cache from CEP on to a database

    Hi,
    We are trying to create an Oracle Complex Event Processing (CEP) application that persists the data stored in the cache (Oracle Coherence) to a back-end database.
    Let me provide the steps I have followed:
    1)     Created a CEP project with cache implementation to store the events.
    2)     Have the following configuration in the context file:
    <wlevs:cache id="Task_IN_Cache" value-type="Task" key-properties="TASKID" caching-system="CachingSystem1">
    <wlevs:cache-store ref="cacheFacade"></wlevs:cache-store>
    </wlevs:cache>
    <wlevs:caching-system id="CachingSystem1" provider="coherence"/>
    <bean id="cacheFacade" class="com.oracle.coherence.handson.DBCacheStore">
    </bean>
    3)     Have the following in the coherence cache config xml:
    <cache-mapping>
    <cache-name>Task_IN_Cache</cache-name>
    <scheme-name>local-db-backed</scheme-name>
    </cache-mapping>
    <local-scheme>
    <scheme-name>local-db-backed</scheme-name>
    <service-name>LocalCache</service-name>
    <backing-map-scheme>
    <read-write-backing-map-scheme>
    <internal-cache-scheme>
    <local-scheme/>
    </internal-cache-scheme>
    <cachestore-scheme>
    <class-scheme>
    <class-name>com.oracle.coherence.handson.DBCacheStore</class-name>
    </class-scheme>
    </cachestore-scheme>
    </read-write-backing-map-scheme>
    </backing-map-scheme>
    </local-scheme>
    4)     Have configured tangosol-coherence-override.xml to make use of coherence in my local machine.
    5)     Have written a class that implements com.tangosol.net.cache.CacheStore
    public class DBCacheStore implements CacheStore{
    But when I try to deploy the project on to the CEP server getting the below error:
    org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'Task_IN_AD': Cannot resolve reference to bean 'wlevs_stage_proxy_forTask_IN_Cache' while setting bean property 'listeners' with key [0]; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'wlevs_stage_proxy_forTask_IN_Cache': Cannot resolve reference to bean '&Task_IN_Cache' while setting bean property 'cacheFactoryBean'; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'Task_IN_Cache': Invocation of init method failed; nested exception is java.lang.RuntimeException: Error while deploying application 'AIT_Caching'. Cache store specified for cache 'Task_IN_Cache' does not implement the required 'com.tangosol.net.cache.CacheStore' interface.
    Can you please let me know if I am missing any configuration. Appreciate your help.

    Hi JK,
    Yes my class com.oracle.coherence.handson.DBCacheStore implements the com.tangosol.net.cache.CacheStore interface. I am providing you with the DBCacheStore class.
    package com.oracle.coherence.handson;
    import com.tangosol.net.CacheFactory;
    import com.tangosol.net.NamedCache;
    import com.tangosol.net.cache.CacheStore;
    import java.sql.DriverManager;
    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.util.Collection;
    import java.util.Iterator;
    import java.util.LinkedList;
    import java.util.List;
    import java.util.Map;
    import oracle.jdbc.pool.OracleDataSource;
    public class DBCacheStore implements CacheStore {
        protected Connection m_con;
        protected String m_sTableName;
        private static final String DB_DRIVER = "oracle.jdbc.OracleDriver";
        private static final String DB_URL = "jdbc:oracle:thin:@XXXX:1521:XXXX";
        private static final String DB_USERNAME = "XXXX";
        private static final String DB_PASSWORD = "XXXX";

        public DBCacheStore() {
            m_sTableName = "TASK";
            System.out.println("Inside constructor");
            init();
            //store("10002", "10002");
        }

        public void init() {
            try {
                OracleDataSource ods = new OracleDataSource();
                /* Class.forName(DB_DRIVER);
                m_con = DriverManager.getConnection(DB_URL, DB_USERNAME, DB_PASSWORD);
                m_con.setAutoCommit(true); */
                ods.setURL(DB_URL);
                ods.setUser(DB_USERNAME);
                ods.setPassword(DB_PASSWORD);
                m_con = ods.getConnection();
                System.out.println("Connection Successful");
            } catch (Exception e) {
                e.printStackTrace();
                //throw ensureRuntimeException(e, "Connection failed");
            }
        }

        public String getTableName() {
            return m_sTableName;
        }

        public Connection getConnection() {
            return m_con;
        }

        public Object load(Object oKey) {
            Object oValue = null;
            Connection con = getConnection();
            String sSQL = "SELECT TASKID, REQNUMBER FROM " + getTableName()
                    + " WHERE TASKID = ?";
            try {
                PreparedStatement stmt = con.prepareStatement(sSQL);
                stmt.setString(1, String.valueOf(oKey));
                ResultSet rslt = stmt.executeQuery();
                if (rslt.next()) {
                    oValue = rslt.getString(2);
                    if (rslt.next()) {
                        throw new SQLException("Not a unique key: " + oKey);
                    }
                }
                stmt.close();
            } catch (SQLException e) {
                //throw ensureRuntimeException(e, "Load failed: key=" + oKey);
            }
            return oValue;
        }

        public void store(Object oKey, Object oValue) {
            System.out.println("Inside Store method");
            NamedCache cache = CacheFactory.getCache("Task_IN_Cache");
            System.out.println("Cache Service:" + " " + cache.getCacheService());
            System.out.println("Cache Size:" + " " + cache.size());
            System.out.println("Is Active:" + " " + cache.isActive());
            System.out.println("Is Empty:" + " " + cache.isEmpty());
            //cache.put("10003", "10003");
            //System.out.println("Values:" + " " + cache.put("10003", "10003"));
            Connection con = getConnection();
            String sTable = getTableName();
            String sSQL;
            if (load(oKey) != null) {
                sSQL = "UPDATE " + sTable + " SET REQNUMBER = ? WHERE TASKID = ?";
            } else {
                // note the column order: the value is bound first, then the key
                sSQL = "INSERT INTO " + sTable + " (REQNUMBER, TASKID) VALUES (?,?)";
            }
            try {
                PreparedStatement stmt = con.prepareStatement(sSQL);
                int i = 0;
                stmt.setString(++i, String.valueOf(oValue));
                stmt.setString(++i, String.valueOf(oKey));
                stmt.executeUpdate();
                stmt.close();
            } catch (SQLException e) {
                //throw ensureRuntimeException(e, "Store failed: key=" + oKey);
            }
        }

        public void erase(Object oKey) {
            Connection con = getConnection();
            String sSQL = "DELETE FROM " + getTableName() + " WHERE TASKID = ?";
            try {
                PreparedStatement stmt = con.prepareStatement(sSQL);
                stmt.setString(1, String.valueOf(oKey));
                stmt.executeUpdate();
                stmt.close();
            } catch (SQLException e) {
                //throw ensureRuntimeException(e, "Erase failed: key=" + oKey);
            }
        }

        public void eraseAll(Collection colKeys) {
            throw new UnsupportedOperationException();
        }

        public Map loadAll(Collection colKeys) {
            throw new UnsupportedOperationException();
        }

        public void storeAll(Map mapEntries) {
            System.out.println("Inside Store method");
            throw new UnsupportedOperationException();
        }

        public Iterator keys() {
            Connection con = getConnection();
            String sSQL = "SELECT TASKID FROM " + getTableName();
            List list = new LinkedList();
            try {
                PreparedStatement stmt = con.prepareStatement(sSQL);
                ResultSet rslt = stmt.executeQuery();
                while (rslt.next()) {
                    Object oKey = rslt.getString(1);
                    list.add(oKey);
                }
                stmt.close();
            } catch (SQLException e) {
                //throw ensureRuntimeException(e, "Iterator failed");
            }
            return list.iterator();
        }
    }

  • Help needed in loading data from BW(200) to APO(100).

    Hi everybody,
    I am trying to load master data from BW(200) to AP1(100). This is what I did:
    1) Created an InfoObject with some fields (general - ebeln; attributes - bukrs, werks, matnr).
    2) Created and activated the communication structure.
    3) Then went to BW(200), created a data source (SBIW) and extracted data from a particular InfoObject which had the same fields, saved it, then went to RSA3 and checked for data availability in the data source; it was available there too.
    4) Came back to AP1(100), opened BW(200) in the source system tab and replicated the data sources. I was able to see the data source name created in BW(200).
    5) Created and activated the transfer structure.
    6) Created an InfoPackage and loaded the data, but the monitor says "NOT YET COMPLETED", "CURRENTLY IN PROCESS", and it also shows "0 of 0 RECORDS".
    I want to know,
    1) Is there any mistake in what I have done above?
    2) How long will it take to complete the loading process?
    Please help me through this problem.
    Thanks,
    Ranjani.

    Hi,
    I am surprised by your steps. In APO, you want to load data from a particular infoobject from BW. Why did you create a specific extractor in SBIW?
    You have just reinvented the wheel...
    Here is what you should do:
    - in BW, at the infosource level, you create a direct infosource based on the infoobject whose data you want to extract to APO (let's say 0MATERIAL)
    - in BW, at the infosource level, you right-click on the infosource itself and choose 'Generate Export DataSource'. That creates the datasources for you (attributes, texts, hierarchies) depending on the infoobject settings; the names of these datamart datasources begin with an 8
    - in APO, you replicate the BW system. Now you find the datasources, 80MATERIAL and the like
    - in APO, you create the transfer rules to your infosource and you can load
    Just give it a try
    Regards

  • Please Help~Need to swap data between two 2010 MacBook Pros

    Ok, I have a 13" mid-2010 MacBook Pro and my wife has a 15" i7 2010 MacBook Pro. I need her MacBook's processing power to start doing some short videos for our church (After Effects, Premiere). She prefers the lighter 13" anyway, so we've decided to swap. I've made two "complete" backups onto a partitioned external hard drive using Carbon Copy Cloner. My objective is to swap all data AND settings from one machine to the other and vice versa. She has very important settings on her MBP that cannot be lost. What is the best route to take from here?
    Thanks in advance for your advice!
    Message was edited by: Muzik74

    Pretty easy. Using the original Install Disc that came with each computer, restart the computer while holding down the Option key and choose the Install Disc to boot from. Choose the language, then choose Utilities Menu > Disk Utility. In Disk Utility, select the computer's internal HD, go to the Erase tab (ensure it's set to Mac OS Extended (Journaled)) and click Erase. Once the erase has been done, exit Disk Utility and continue with the installation. At the end of the installation it will ask if you want to restore from a volume connected to the computer. Choose that option, select all the options, and within a couple of hours the machine will look and act like your old machine. Do the same with the other computer and you're done with the swap.
