Creating streams and sensor data hub

Hi all,
I am working on the Getting Started example from OTN ("HandsOnSession.pdf"),
and in the step to create the stream and sensor data hub I am having a problem.
I am working with the standalone SES v10.1.2 running on Linux.
While running this command at the Linux terminal,
sqlplus system/welcome1@orcl
I get a "-bash: sqlplus: command not found" error.
I am in <filepath>/edge/sql when I execute it. But I observed that my Oracle Edge Server install folder does not have a sqlplus folder, which I think is installed by default in the Oracle home folder. There is a sqlj folder, though, but I am not very familiar with sqlj.
Could anyone advise on how to connect to the database and create the SDH tables?
Also, for the sqlplus system/welcome1@orcl command, could anyone confirm whether the syntax is as below:
sqlplus <username>/<password>@orcl
where
<username> = the username used for the Edge Server installation
<password> = the password used for the Edge Server installation
I would appreciate your response.
Thanks,
Vijay

Hi Vijay!
The EdgeServer doesn't have a full Oracle installation, so a client tool such as sqlplus is not present. Install an Oracle Database on this machine (I suspect you do not have a database yet) and use the sqlplus that comes with it.
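For reference, once a database (or full client) is installed, the environment setup and connection would look roughly like this; the ORACLE_HOME path below is only an illustrative assumption, so adjust it to your actual installation:

     # assumption: replace with the real path of your database or client home
     export ORACLE_HOME=/u01/app/oracle/product/10.2.0/db_1
     export PATH=$ORACLE_HOME/bin:$PATH
     sqlplus system/welcome1@orcl

And on your syntax question: yes, the form is sqlplus <username>/<password>@<net_service_name>, but system/welcome1 is the database SYSTEM account and its password (whatever it was set to when the database was created), not the OS user that installed the Edge Server.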
cu
Andreas

Similar Messages

  • ORA-39080: failed to create queues "" and "" for Data Pump job

    When I am running datapump expdp I receive the following error:
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, OLAP and Data Mining options
    ORA-31626: job does not exist
    ORA-31637: cannot create job SYS_EXPORT_SCHEMA_01 for user CHESHIRE_POLICE_LOCAL
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
    ORA-06512: at "SYS.KUPV$FT_INT", line 600
    ORA-39080: failed to create queues "" and "" for Data Pump job
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
    ORA-06512: at "SYS.KUPC$QUE_INT", line 1555
    ORA-01403: no data found
    Sys has the following two objects as invalid at present after running catproc.sql and utlrp.sql and manual compilation:
    OBJECT_NAME OBJECT_TYPE
    AQ$_KUPC$DATAPUMP_QUETAB_E QUEUE
    SCHEDULER$_JOBQ QUEUE
    While I run catdpb.sql, the Data Pump queue table is not created:
    BEGIN
      dbms_aqadm.create_queue_table(
        queue_table        => 'SYS.KUPC$DATAPUMP_QUETAB',
        multiple_consumers => TRUE,
        queue_payload_type => 'SYS.KUPC$_MESSAGE',
        comment            => 'DataPump Queue Table',
        compatible         => '8.1.3');
    EXCEPTION
      WHEN OTHERS THEN
        IF SQLCODE = -24001 THEN
          NULL;
        ELSE
          RAISE;
        END IF;
    END;
    ERROR at line 1:
    ORA-01403: no data found
    ORA-06512: at line 7

    Snehashish Ghosh wrote:
    While I run catdpb.sql, the Data Pump queue table is not created:
    dbms_aqadm.create_queue_table(queue_table => 'SYS.KUPC$DATAPUMP_QUETAB', ..., compatible => '8.1.3');

    Does it work better when specifying an Oracle version that is from this century, newer than 8.1?
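    As a concrete sketch of that suggestion (untested; the compatible value here is an assumption chosen to match the 10.2 database, everything else as in catdpb.sql):

    BEGIN
      dbms_aqadm.create_queue_table(
        queue_table        => 'SYS.KUPC$DATAPUMP_QUETAB',
        multiple_consumers => TRUE,
        queue_payload_type => 'SYS.KUPC$_MESSAGE',
        comment            => 'DataPump Queue Table',
        compatible         => '10.0');  -- assumption: align with the database version
    EXCEPTION
      WHEN OTHERS THEN
        IF SQLCODE = -24001 THEN  -- ORA-24001: queue table already exists
          NULL;
        ELSE
          RAISE;
        END IF;
    END;
    /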

  • Time Streams and Shipping Dates

    Hello,
    Does anyone have any documentation regarding time streams? I am checking how time streams relate to the determination of shipping dates such as the loading date, pricing date, GI date, etc.
    I noticed that when we run the ZZTTSTR program from SAP, which clears time stream handles, before we create order documents, some order documents somehow determine wrong shipping dates.
    We set up a background job to run ZZTTSTR daily every night, and during the day we upload order files which convert to about 500-1000 order IDocs and are posted as order documents.
    Now I have removed the background job for ZZTTSTR and the issue has never happened again. I'm looking for more info about time streams so I can determine how this caused my issue.
    I hope someone can help.

    You will get it within the window you were originally promised by Apple. If you get it early, bonus.
    There is nothing else anyone here can say about it.
    Speculation and rumor are prohibited by the Apple Support Communities Terms of Use.

  • Problem with Create PDF and linked data

    Using Acrobat X standard.
    I have a Word 2010 doc with links to Visio and Excel data. 
    When using the CreatePDF option from Word, I consistently get a prompt asking whether I want to update the linked data before creating the PDF. I would like to either (1) turn this prompt off or (2) -- less preferable -- have the linked data always update without prompting.
    Is there a setting that would accomplish one of these two solutions?
    Thanks,
    Craig
    PS.  All settings in Word have been set such that data IS NOT updated when printing on a regular printer.

    You can print to Adobe PDF.

  • Create files and compare data

    I have an application that can receive data (an integer and a string); it stores the data in a datafile and adds the date and time. However, every time a new set of data comes in, I need to compare it with the rest of the data stored in the same file. If the integer (the key) is repeated, the data should be stored in a different file; if not, it should be stored there.
    I have been able to read and write data files in separate VIs, but I can't make an application that does everything at the same time. I will appreciate any suggestions. I'm new to LabVIEW and there is a lot I don't know yet.
    Thank you!!!!

    Thanks for your suggestion. However, the main problem is not how to read and write together but how to compare the data. I started with separate writing and reading VIs because I wanted to isolate every part of my program, be sure it works, and then build on top of it. The complete application should do something like:
    -- Open two files, one for new data, one for repeated data
    -- User enters data -> a key as an integer and some string
    -- Compare the new key with ALL the keys already stored in the file:
         -- if the file is empty or the key is not found in the file, write the data to the file for new data
         -- if the key is repeated, write the data to the file for repeated data
    -- Keep doing this until the user stops the application.
    What I really need is to be able to compare the recently entered data to what was entered before. I chose to save data in a datalog because it seemed the best solution and because I also needed to store the date and time. However, if there is any other way to store the data and compare it, please let me know. I tried arrays and clusters by themselves but it wasn't easy. Datalogs worked fine for storing data, but now comparing is too hard because I don't know how to read the file record by record while comparing it to the new data I just got.
    I hope this is a little bit clearer!

  • Created by and creation date default values on form

    Hi Friends,
    I am a form newbie. I am creating a simple form.
    I have non-display column/fields:
    a.) CREATED_BY - how do I put the default value of the user login ID upon inserting record?
    b.) CREATION_DATE - how do I put the SYSDATE upon inserting the record?
    c.) UPDATED_BY - how do I put the default value of the user login ID when updating the record?
    d.) UPDATE_DATE - how do I put the SYSDATE when updating the record?
    Thanks a lot in advance

    Hi,
    In the PRE-INSERT trigger, write:
         :CREATED_BY    := GET_APPLICATION_PROPERTY(USERNAME);
         :CREATION_DATE := SYSDATE;
    And in the PRE-UPDATE trigger, write:
         :UPDATED_BY  := GET_APPLICATION_PROPERTY(USERNAME);
         :UPDATE_DATE := SYSDATE;
    Then your issue should be solved.
    Regards,
    Manu.
    If my response or the response of another was helpful, please mark it accordingly.
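    If you prefer these defaults to be applied no matter which tool inserts the row, a database trigger is an alternative; here is a minimal sketch, assuming a hypothetical table MY_TABLE with the four columns named above (note that USER here is the database login, which may differ from the Forms application user):

    CREATE OR REPLACE TRIGGER my_table_audit_trg  -- hypothetical trigger and table names
    BEFORE INSERT OR UPDATE ON my_table
    FOR EACH ROW
    BEGIN
      IF INSERTING THEN
        :NEW.created_by    := USER;     -- database session user, not the Forms login
        :NEW.creation_date := SYSDATE;
      ELSE
        :NEW.updated_by    := USER;
        :NEW.update_date   := SYSDATE;
      END IF;
    END;
    /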

  • Contacts creates duplicates and loses data

    Contacts has been causing problems: I created a new entry; this created over 36,000 duplicates. I entered data into the Notes section; this seemed to save, but when I returned the following day it had disappeared and reset to an earlier version. The biggest problem is that it loses new data and does not update.
    I also had to Force Quit an AddressBookSync app in Activity Monitor (I'm unsure of the app's specific name), which had been hogging over 96% of CPU for over an hour and slowing everything else down. The core app and its supporting apps seem to have some fundamental bugs in them.
    Can anyone please:
    1. identify the likely causes
    2. identify the likely solution, based on (1)
    Many thanks.
    PS I've an iMac and an Airbook, both running Mavericks.

    Ah.. in that case you're referring to *.Mac Sync*, rather than iSync.
    The *.Mac Sync* forum is here: http://discussions.apple.com/forum.jspa?forumID=957
    It's probably better to post in there as that's where the .Mac Sync experts are!

  • Create item and save data...

    Hi friends,
    [Apps R12]
    If I create an item through personalization on a screen and set it to reference an attribute of an existing VO on the screen, it retrieves the attribute content successfully if it has values. But if I put a new value into it, the value is not saved to the database.
    Do I need to do anything more?
    Thanks.

    Hello,
    After creating the item, set the view instance to ProjectBasicInfoVO and the view attribute to Attribute1.
    The Attribute1 value is then passed as a parameter, so it will work:
    PA_PROJECTS_MAINT_PUB.UPDATE_PROJECT_BASIC_INFO(
      p_validate_only => FND_API.G_FALSE, p_project_id => :1, p_project_name => :2,
      p_project_number => :3, p_project_type => :4, p_description => :5,
      p_project_status_code => :6, p_public_sector_flag => :7,
      p_carrying_out_organization_id => :8, p_organization_name => :9,
      p_start_date => :10, p_completion_date => :11, p_territory_code => :12,
      p_country => :13, p_location_id => :14, p_state_region => :15, p_city => :16,
      p_attribute_category => :17, p_attribute1 => :18, p_attribute2 => :19,
      p_attribute3 => :20, p_attribute4 => :21, p_attribute5 => :22,
      p_attribute6 => :23, p_attribute7 => :24, p_attribute8 => :25,
      p_attribute9 => :26, p_attribute10 => :27, p_priority_code => :28,
      p_record_version_number => :29, p_recalculate_flag => :30,
      x_return_status => :31, x_msg_count => :32, x_msg_data => :33,
      p_target_start_date => :34, p_target_finish_date => :35,
      p_security_level => :36, p_long_name => :37, p_funding_approval_status => :38)
    Thanks ,
    Kumar

  • Using Swing how can we create socket and send data thru TCP/IP on the socke

    Hi All,
    Can anyone point me to a link, or give me an answer, about socket programming with Swing and getting/posting data on the socket?
    Thanx & Regard
    Ashu

    Swing has nothing to do with socket programming; you need to code using core Java and the java.net API.
    So please go through this link: [http://www.javaworld.com/javaworld/jw-12-1996/jw-12-sockets.html]

  • How to call routine and pass data to routine in vofm

    Hi Experts,
    I need to update the KBETR and KWERT values on the 'Conditions' tab in a purchase order (ME21N/ME22N).
    I have created a new customer tab in which we enter an amount field and a percentage field. When the user enters a value there and clicks on the 'Conditions' tab, a calculation has to be done and the calculated value has to appear against a specific condition type. As I am new to ABAP, I don't know how to create a routine in VOFM and pass data to it from the customised tab in ME21N.
    Thanks in advance.

    Hello Rajendra,
    You can find plenty of threads on SCN related to this. Follow the steps below to create a VOFM routine.
    Go to transaction code VOFM.
    1. On the menu, select the required application, i.e. Pricing.
    2. Enter any number between 600 and 999 for custom developments.
    3. On entering it, a popup screen appears asking for an access key (remember that every new routine needs an access key).
    4. Once the access key is received, we can make the modification.
    5. Enter the routine number and description, and insert the access key.
    6. The ABAP editor will now open; the required code can be copied from the standard SAP routine and the custom code can be developed.
    7. Once the coding is completed, we have to activate the routine.
    8. Select the routine and go to Edit - Activate.
    9. Ensure that the Active checkbox is ticked upon activation of the routine.
    10. Double-clicking on the routine enters the ABAP editor; we then have to generate the routine.
    11. Go to Program and select Generate.
    12. A screen pops up with the related main programs; select all required main programs wherever the routine is being called.
    13. Once the routine is generated and activated, we need to configure it in the config.
    ** Important SAP note: 156230.
    Check the below document too.
    http://www.scribd.com/doc/35056841/How-to-create-Requirement-Routines
    Regards,
    Thanga

  • I2C interface (Sensor Data Acquisition) LabVIEW

    Hi all!
    Hope you are doing great!
    Well I have a question which is more about asking all you for an idea!
    The Situation:
    I have a circuit board which has an on-off valve, digital pressure sensors (manufacturer: AMD) and humidity/temperature sensors (make: IST Hygrosens). On the board all the sensors communicate as I2C slave devices, and all the data from the sensors is read into an I2C-to-USB adapter chip which connects to the PC via a normal USB cable.
    In addition to this board, there is a relay circuit with a simple 1-pole relay which controls the on-off valve on the circuit board above. This valve is controlled entirely separately, via a coaxial cable running from the relay directly to the valve. But the relay board has an I2C interface and also acts as a slave device. The relay board has the same I2C-to-USB adapter chip.
    Both the relay board and the sensor board connect via USB to the PC, which I suppose is the master device.
    The software code written for this arrangement and sensor data acquisition is too old and causes a lot of problems. I have almost given up troubleshooting.
    I now want to move this automation system to LabVIEW. I searched the NI website, where there is a device called the USB-8451 which supports the I2C interface... I am a beginner in LabVIEW and can't really make sense of how I should go about implementing this system in LabVIEW.
    If you guys can please help me out to at least get started (like what hardware I would need, etc.), so I have a clear picture, it would be a great help!
    Looking forward to your inputs and Thank you so much in advance!
    Cheers!
    Pramit

    NI provides a LabVIEW API for the USB-8451. If you use the USB-8451, you would use the provided API to write a program that controls the device, and you would do all of the I2C communication in your program. This means using functions/subVIs to connect to the USB-8451 and then performing I2C operations through it.
    If your device already uses USB, then you would probably use NI-VISA as the driver and have to get or write your own API to talk to the specific device. The manufacturer may have a LabVIEW (or other) API available for talking to the device. If not, then you would have to understand the details of how to communicate with the device and write an API using NI-VISA serial functions. This means making NI-VISA the assigned driver for the device and then using VISA serial functions/subVIs to send the messages and receive the responses.

  • Best approach to transfer and transform data between two Oracle Instances

    Let me first preface this post with the fact that I am an Oracle newb.
    The current problem I am trying to solve is how to quickly transfer data from a customer's legacy database into a new normalized destination that is modeled differently than the legacy data. So the data not only has to be transferred, but also transformed (into normalized tables).
    We have discovered, the hard way, that reading the data with a C++ application and performing inserts (even with indexes and constraints disabled) is just way too slow for what we need. We are dealing with around 20 million records here. I need to determine the best approach for extracting this data from the source and inserting it into the destination. Any comments or tips are greatly appreciated.
    Note: I have read about SQL*Loader and mentioned it to management, but they seem resistant to this approach. It's not totally out of the question, though.

    Oracle has a lot of technologies that fall under the general heading of "replication" to choose from. Going from a database to a flat file and back to another database is generally not the most efficient solution-- it's generally much easier to go straight from one database to another.
    Oracle provides materialized views, Streams, and Change Data Capture (CDC)-- any of these would be able to give your new system incremental feeds of data from the old system's data and could be used by whatever routines you write to transform the data between data models.
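    As a sketch of the materialized view flavour (all object names here are hypothetical, and the transform is only an illustration):

    -- assumption: a database link named legacy_db to the old instance already exists
    CREATE MATERIALIZED VIEW customers_mv
      BUILD IMMEDIATE
      REFRESH COMPLETE ON DEMAND
      AS SELECT DISTINCT
           o.cust_id,
           UPPER(TRIM(o.cust_name)) AS cust_name   -- example cleanup/transform
         FROM legacy_orders@legacy_db o;           -- hypothetical legacy table

    For a one-off load of 20 million rows, a plain direct-path insert over the same link (INSERT /*+ APPEND */ INTO new_table SELECT ... FROM old_table@legacy_db) is often the simplest thing to try first.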
    Justin
    Distributed Database Consulting, Inc.
    http://www.ddbcinc.com/askDDBC

  • Create stream from csv in Stream Explorer: Source has not been saved properly

    I'm creating a demo application with Oracle Stream Explorer, but uploading a file does not work the way I expected.
    I go through the "Create Stream" wizard and upload a 500 KB CSV file. In the final step I click Create, and a message pops up:
    Source has not been saved properly
    Unable to deploy OEP application
    In the logs I find the following:
    INFO: Unable to check state of application "sx-5-7-eventbuffer": javax.management.RuntimeOperationsException: No deployment named [sx-5-7-eventbuffer]
    mrt 15, 2015 10:48:01 AM oracle.wlevs.strex.core.OEPRuntime deploy
    INFO: OEPRuntime.deploy: sx-5-7-eventbuffer, size=1732, time=214ms
    mrt 15, 2015 10:48:01 AM oracle.wlevs.strex.core.AbstractManager updateOEPApplication
    INFO: updateOEPApplication: 226ms
    mrt 15, 2015 10:48:01 AM oracle.wlevs.strex.core.SXContext close
    INFO: SourceService.createSource - 226ms
    mrt 15, 2015 10:48:01 AM oracle.wlevs.strex.service.SourceService createSource
    SEVERE: null
    oracle.wlevs.strex.model.SXException: Unable to deploy OEP application
        at oracle.wlevs.strex.core.OEPRuntime.deploy(OEPRuntime.java:128)
    Any ideas on what went wrong here?

    Hi Experts,
    After I tried a few times, I could successfully start Prepare with the upgrade assistant monitor: Administrator >> Start Prepare.
    Thanks

  • Oracle data hubs

    Can anyone tell me where I can download evaluation copies of Oracle data hubs? I checked the download site and there is no mention of Fusion and/or data hubs. TIA

    I know, so sorry. This software does not appear to be offered as a download at this time from a publicly accessible web page. You will need to contact Oracle Corp at this juncture.

  • Best way to stream lots of data to file and post process it

    Hello,
    I am trying to do something that seems like it should be quite simple but am having some difficulty figuring out how to do it. I am running a test that has over 100 channels of mixed sensor data. The test will run for several days or longer at a time, and I need to log/stream data at about 4 Hz while the test is running. The data I need to log is a mixture of different data types that includes a time stamp, several integer values (both 32- and 64-bit), and a lot of floating-point values. I would like to write the data to file in a very compressed format because the test is scheduled to run for over a year (stopping every few days) and the data files can get quite large. I currently have a solution that simply bundles all the data into a cluster, then writes/streams the cluster to a binary file as the test runs. This approach works fine but involves some post-processing to convert the data into a format, typically a text file, that can be worked with in programs like Excel or DIAdem. After the files are converted into text files they are, no surprise, a lot larger (about 3 times) than the original binary file size.
    I am considering several options to improve my current process. The first option is writing the data directly to a TDMS file, which would allow me to quickly import the data into DIAdem (or Excel with a plugin) for processing/visualization. The challenge I am having (note: this is my first experience working with TDMS files and I have a lot to learn) is that I cannot find a simple way to write/stream all the different data types into one TDMS file and keep each scan of data (containing different data types) tied to one time stamp. Each time I write data to file, I would like the write to contain a time stamp in column 1, integer values in columns 2 through 5, and floating-point values in the remaining columns (about 90 of them). Yes, I know there are no columns in binary files, but this is how I would like the data to appear when I import it into DIAdem or Excel.
    The other option I am considering is just writing a custom DataPlugin for DIAdem that would allow me to import the binary files that I am currently creating directly into DIAdem. If someone could provide me with some suggestions as to which option would be best, I would appreciate it. Or, if there is a better option that I have not mentioned, feel free to recommend it. Thanks in advance for your help.

    Hello,
    Here is a simple example; of course, here I only create one value per iteration of the while loop, for simplicity. You can also set properties of the file, which can be useful, and set up different channels.
    Besides, you can use multiple groups to have more flexibility in data storage. You can think of channels as columns and groups as sheets in Excel, so that is how you will see your data when you import the TDMS file into Excel.
    I hope this helps; of course there are many more advanced features of TDMS files, so read the help docs!

Maybe you are looking for

  • Bug: Prompt on Union Report

    Hi, I am using Siebel Analytics. There seems to be a bug with the dashboard prompt. I have YEAR and MONTHS_BETWEEN prompts; they work with an individual report, but when used with a union report they don't work as required. Is this a documented bug or am I goi

  • Error while upgrading ODI 11.1.1.3 to 11.1.1.5

    Hello, I downloaded the patch for 11.1.1.5 and tried to run the psa.bat and I end up getting this error. Pls let me know if you have faced similar issue Thanks in advance. Oracle Fusion Middleware Patch Set Assistant 11.1.1.5.0 java.lang.UnsatisfiedL

  • Adobe premiere cc

    I can't start adobe premiere cc after installing

  • Display BLOB contents to client

    I want to display the blob contents in the clients window and give him the option either to open or save the contents. I am able to read the contents from Oracle Blob Field and stored the contents in the ByteArrayOutputStream object but I am not able

  • Bold/underlining text on iPhone 4 S

    How do I do this? Many thanks