Data Streams setup: ORA-01374

Hi,
I followed the Data Streams documentation to set up simple replication on Solaris 8, Oracle 9.2.0.4. While starting the capture process I get this error message:
ORA-01374: log_parallelism greater than 1 not supported in this release
even though my log_parallelism = 1. I followed the document exactly.
HELP PLEASE!!!
- Siva

I got exactly the same problem !!
Any suggestions ???

Similar Messages

  • Getting error while doing stream setup through OEM

    hi all,
    I am getting the following error while doing Streams setup for user stradmin on the ORCL db to stradmin_dest on the orcldestination db, on the same Windows machine:
    java.sql.SQLException: ORA-20204: User does not exist: STRADMIN ORA-06512: at "SYSMAN.MGMT_USER", line 122 ORA-06512: at "SYSMAN.MGMT_JOBS", line 139 ORA-06512: at "SYSMAN.MGMT_JOBS", line 78 ORA-06512: at line 1
    I provided everything (host credentials, usernames for the new users, and the tables to capture).
    It's urgent.
    Thanks in advance for any help.

    hi,
    Thanks for the reply, but the DB user does exist. It seems to have been a problem of the Windows environment with Streams, and I was able to solve it. But after the whole setup and starting apply and capture, the capture process runs for hours and nothing is replicated at the destination database. Normally, how fast is replication for a small schema like SCOTT on table EMP?
    thanks

  • Data-streaming feature not available in eWay BatchSFTP

    Hi all,
    I'm using JCAPS 5.1.2 and I've seen in the document
    "eWay Batch Adapter User's Guide - Release 5.1.2"
    that for BatchFTP it is possible to transfer files using the data-streaming feature, which is useful for big files
    ("Data Streaming: Allows your application to stream data directly to and from a local file system when used together with the BatchFTP OTD or the record-processing OTD. This feature minimizes the required RAM when large files are read, because the entire file is never loaded in memory.").
    I need to use BatchSFTP to transfer big files, but for BatchSFTP the data-streaming feature is not available (it is not mentioned in the eWay Batch Adapter User's Guide).
    Can somebody suggest how to use BatchSFTP to transfer big files without consuming all system resources (the file is held in memory in the payload element of BatchSFTP)?
    Why is the data-streaming feature not available for BatchSFTP?
    Thanks in advance
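    As background, the "data streaming" the guide describes boils down to chunked transfer: the payload is moved through a fixed-size buffer instead of being held in memory in one piece. A minimal, generic C++ sketch of that idea (not JCAPS- or BatchSFTP-specific; the paths and buffer size are illustrative):

    #include <fstream>
    #include <vector>

    // Copy a large file through a 64 KB working buffer so the whole payload
    // is never resident in memory at once.
    void copy_in_chunks(const char *srcPath, const char *dstPath)
    {
        std::ifstream src(srcPath, std::ios::binary);
        std::ofstream dst(dstPath, std::ios::binary);
        std::vector<char> buf(64 * 1024);
        while (src)
        {
            src.read(buf.data(), buf.size());
            dst.write(buf.data(), src.gcount());   // write only what was actually read
        }
    }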

    Hi,
    Given your post on this forum it seems you've got BatchSFTP working. Unfortunately I can't help you with your question, but maybe you can help me out...
    I'm trying to set up BatchSFTP and am having trouble setting up the key file. What kind of file(s) is (are) expected in the key-file directories specified in the external system properties?
    I have tried a trusted_host file generated by an SSH client but get the following error:
    Batch SFTP eWay connection failed, method=[connect()], message=[Exception when connect(), e=java.io.IOException: Invalid SSH1 public key format].
    Thanks for your help!
    Cor Zijlstra
    [email protected]

  • PbyFwData - Pointer to the firmware binary data stream.

    Hello,
    I used LabVIEW to call a DLL, but it has one parameter, "pbyFwData", which is a pointer to the firmware binary data stream. I don't understand what "pointer to the firmware binary data stream" means. I tried to make this parameter point to the firmware path, but the program can't run. Below is the DLL source code. Can anyone tell me how to pass a value to the parameter "pbyFwData" from LabVIEW?
    // Purpose:      Public function to download the Flash Preloader.
    // Parameter:
    //   hZvDev    - Handle of the Zeevo Device.
    //   pbyFwData - Pointer to the firmware binary data stream.
    //   uFwLen    - The length of the preloader.
    // Return value: SUCCESS if successful; Otherwise, the error code.
    ZEEVOPTU_API int ZV4301_SendPreLoader(HANDLE hZvDev,
                                          unsigned char *pbyFwData,
                                          unsigned long uFwLen)
    {
       ZV4301 *pZvDev;
       COMM_DESC *pPort;
       int nResult, nTmOut;
       const unsigned long uBlkSize = 0x400;
       unsigned long uDummy, uCRC;
       unsigned char byDataBuf[32];
       unsigned char byData;
       if (!Validate(hZvDev))
          return ERR_BAD_HANDLE;
       pZvDev = (ZV4301 *)hZvDev;
       pPort = pZvDev->PortDesc;
       // Setup the Address and Length
       memset(byDataBuf, 0, sizeof(byDataBuf));
       LD_LoaderProgramAddr(byDataBuf) = LD_ZV4301_LOADER_ADDR;
       LD_LoaderProgramLengh(byDataBuf) = uFwLen;
       // Calculate the Header CRC
       uCRC = UT_CRCBuffer(byDataBuf+4, 8);
       LD_LoaderProgramHdrCRC(byDataBuf) = uCRC;
       // Calculate the Data CRC
       uCRC = CalculateCRC(pbyFwData, uFwLen);
       LD_LoaderProgramDataCRC(byDataBuf) = uCRC;
       // Detect target reset
       nResult = IsChipPwrOnReset(pZvDev, 10000);
       if (nResult != SUCCESS)
          return nResult;
       // send out header first
       nResult = WriteSerial(pZvDev, byDataBuf, LD_LOADER_PRGM_HDR_LEN);
       if (nResult != SUCCESS)
          return nResult;
       while (uFwLen > 0)
       {  // sending the preloader file
          if (uFwLen >= uBlkSize)
          {
             nResult = WriteSerial(pZvDev, pbyFwData, uBlkSize);
             pbyFwData += uBlkSize;
             uFwLen -= uBlkSize;
          }
          else
          {
             nResult = WriteSerial(pZvDev, pbyFwData, uFwLen);
             uFwLen = 0;
          }
          if (nResult != SUCCESS)
             return nResult;
       }
       nTmOut = 100;
       while (nTmOut > 0)
       {  // getting the respond 0xAA
          if (GetByte(pZvDev, &byData))
             if (byData == 0xAA)
                break;
          Sleep(10);
          nTmOut -= 10;
       }
       if (nTmOut == 0)
          return ERR_DOWNLD_PRELDR;
       nResult = SetCommBaud(pZvDev, FLOWCTRL_DIS, pPort->uBaud, &uDummy);
       return nResult;
    }

    Hi,
    The pointer is a U32 that points to a memory location where the data is. It is not clear from the prototype whether the pointer points to data that is fed to the DLL or returned from the DLL.
    If data is fed in, wire an array to this parameter and set uFwLen to its length.
    If data is returned, you need to use some other API calls to get the data from memory...
    Regards,
    Wiebe.
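    To make that concrete: the caller has to hand the DLL a pointer to the firmware bytes themselves, not to the file path. In LabVIEW that means wiring a U8 array (the Call Library Function node parameter configured as an array data pointer) to pbyFwData and the array size to uFwLen. A hypothetical C++ caller doing the same thing (sketch only; the device handle must come from the DLL's own open/initialisation call, which is not shown in the post):

    #include <windows.h>
    #include <fstream>
    #include <iterator>
    #include <vector>

    // Prototype as posted (export/calling-convention macro omitted here).
    extern "C" int ZV4301_SendPreLoader(HANDLE hZvDev,
                                        unsigned char *pbyFwData,
                                        unsigned long uFwLen);

    int send_preloader(HANDLE hZvDev, const char *fwPath)
    {
        // Read the firmware file into memory ...
        std::ifstream f(fwPath, std::ios::binary);
        std::vector<unsigned char> fw((std::istreambuf_iterator<char>(f)),
                                      std::istreambuf_iterator<char>());
        // ... and pass a pointer to the data plus its length, not the path string.
        return ZV4301_SendPreLoader(hZvDev, fw.data(), (unsigned long)fw.size());
    }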
    "Jimmy168" <[email protected]> wrote in message news:[email protected]...
    Hello,I used labview to call a dll, but it have one parameter "pbyFwData" which Pointer to the firmware binary data stream.i don't understand what mean is "Pointer to the firmware binary data stream".I try to make this parameter point to the firmware path, but the program can't be run.the below is the .dll source code. who can tell you how to pass the value to the parameter "pbyFwData" by labview?
    &nbsp;
    //------------------------------------------------------------------------------// Purpose:&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; Public function to download the Flash Preloader.// Parameter://&nbsp;&nbsp; hZvDev&nbsp;&nbsp;&nbsp; - Handle of the Zeevo Device.//&nbsp;&nbsp; pbyFwData - Pointer to the firmware binary data stream.//&nbsp;&nbsp; uFwLen&nbsp;&nbsp;&nbsp; - The length of the preloader.// Return value: SUCCESS if successful; Otherwise, the error code.//------------------------------------------------------------------------------ZEEVOPTU_API int ZV4301_SendPreLoader(HANDLE hZvDev,&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; unsigned char *pbyFwData,&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; unsigned long uFwLen){ &nbsp;&nbsp; ZV4301 *pZvDev;&nbsp;&nbsp; COMM_DESC *pPort;&nbsp;&nbsp; int nResult, nTmOut;&nbsp;&nbsp; const unsigned long uBlkSize = 0x400;&nbsp;&nbsp; unsigned long uDummy, uCRC;&nbsp;&nbsp; unsigned char byDataBuf[32];&nbsp;&nbsp; unsigned char byData;
    &nbsp;&nbsp; if (!Validate(hZvDev))&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; return ERR_BAD_HANDLE;&nbsp;&nbsp; pZvDev = (ZV4301 *)hZvDev;&nbsp;&nbsp; pPort = pZvDev-&gt;PortDesc;
    &nbsp;&nbsp; // Setup the Address and Length&nbsp;&nbsp; memset(byDataBuf, 0, sizeof(byDataBuf));&nbsp;&nbsp; LD_LoaderProgramAddr(byDataBuf) = LD_ZV4301_LOADER_ADDR;&nbsp;&nbsp; LD_LoaderProgramLengh(byDataBuf) = uFwLen;&nbsp;&nbsp; // Calculate the Header CRC&nbsp;&nbsp; uCRC= UT_CRCBuffer(byDataBuf+4, 8);&nbsp;&nbsp; LD_LoaderProgramHdrCRC(byDataBuf) = uCRC;&nbsp;&nbsp; // Calculate the Data CRC&nbsp;&nbsp; uCRC = CalculateCRC(pbyFwData, uFwLen);&nbsp;&nbsp; LD_LoaderProgramDataCRC(byDataBuf) = uCRC;&nbsp;&nbsp; // Detect target reset&nbsp;&nbsp; nResult = IsChipPwrOnReset(pZvDev, 10000);&nbsp;&nbsp; if (nResult != SUCCESS)&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; return nResult;&nbsp;&nbsp; // send out header first&nbsp;&nbsp; nResult = WriteSerial(pZvDev, byDataBuf, LD_LOADER_PRGM_HDR_LEN);&nbsp;&nbsp; if (nResult != SUCCESS)&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; return nResult;
    &nbsp;&nbsp; while (uFwLen &gt; 0)&nbsp;&nbsp; {&nbsp; // sending the preloader file&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; if (uFwLen &gt;= uBlkSize)&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; {&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; nResult = WriteSerial(pZvDev, pbyFwData, uBlkSize);&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; pbyFwData += uBlkSize;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; uFwLen -= uBlkSize;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; }&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; else&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; {&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; nResult = WriteSerial(pZvDev, pb

  • SAP MII 14.0 SP5 Patch 11 - Error has occurred while processing data stream Dynamic Query role is not assigned to the Data Server

    Hello All,
    We are using a two tier architecture.
    Our Corp server calls the refinery server.
    Our CORP MII server uses user id abc_user to connect to the refinery data server.
    The user id abc_user has the SAP_xMII_Dynamic_Query role.
    The data server also has the checkbox for allow dynamic query enabled.
    But we are still getting the following error
    Error has occurred while processing data stream
    Dynamic Query role is not assigned to the Data Server; Use query template
    Once we add the SAP_xMII_Dynamic_Query role to the data server, everything works fine. Is this behavior by design?
    Thanks,
    Kiran

    Thanks Anushree !!
    I thought that just adding the role to the user and enabling the dynamic query checkbox on the data server should work.
    But we even needed to add the role to the data server.
    Thanks,
    Kiran

  • Can we use 0INFOPROV as a selection in Load from Data Stream

    Hi,
    We have implemented BW-SEM BPS and BCS (SEM-BW - 602 and BI 7 ) in our company.
    We have two BPS cubes for Cost Center and Revenue Planning and we have Actuals Data staging cube, we use 0SEM_BCS_10 to load actuals.
    We created a MultiProvider on BPS cubes and Staging cube as a Source Data Basis for BCS.
    Issue:
    When loading plan data or actuals data into the BCS cube (0BCS_C11) using the Load from Data Stream method, we have a performance issue. We automated the load process in a process chain; sometimes it takes about 20 hours just for the plan data load for 3 group currencies, followed by the elimination tasks.
    What I noticed is that, for example, when loading plan data the system is also reading the actuals cube, which is not required, and there is no selection available in the Mapping or Selection tab where I can restrict the data load to a particular cube.
    I tried to add 0INFOPROV to the data basis, but then it doesn't show up as a selection option in the data collection tasks.
    Is there a way to restrict the data load into BCS using this load option and to restrict which cube I read data from?
    I know that there is a filter BAdI available, but I am not sure how it works.
    Thanks !!
    Naveen Rao Kattela

    Thanks Eugene,
    We do have other characteristics like Value Type (10 = Actual and 20 = Plan) and Version (100 = USD Actual and 200 = USD Plan), but when I am loading data into BCS using the Load from Data Stream method, the request goes to all the underlying cubes, which in my case are the planning cubes and the actuals cube. I don't want the request to go to the actuals cube when I am running only the plan load; I think it is causing a performance issue.
    For this reason I am thinking that if I can use 0INFOPROV, as we do in BEx queries, to filter the InfoProvider, the data load performance will improve.
    I was able to bring 0INFOPROV into the data basis by adding 0INFOPROV to the characteristics folder used by the data basis.
    I can see this InfoObject in the Data Stream Fields tab. I check-marked it for use in the selection and regenerated the data basis.
    I was expecting this field to now be available for selection in the data collection method, but it is not.
    So if it is confirmed that there is no way we can use 0INFOPROV as a selection, then I will suggest to my client a redesign of the data basis itself.
    Thanks,
    Naveen Rao Kattela

  • TS3989 I have Photo Stream set up in iPhoto on my MacBook Pro and I just got 2 iPhones for my family, and only one streams photos to iCloud Photo Stream. Both iPhones have the same Apple ID and have Photo Stream turned on. Shouldn't both iPhones stream photos?

    I have Photo Stream set up in iPhoto on my MacBook Pro and on 2 new iPhones that have the same Apple ID as the MacBook Pro. However, only one iPhone will stream photos to iPhoto on the MacBook Pro. All these devices were purchased in the last 2 weeks. What setting is wrong?

    Photo Stream only syncs photos over Wi-Fi, so make sure you are connected to Wi-Fi to begin with.
    Photo Stream only syncs when the Camera app is closed, so ensure it is closed.
    Photo Stream only syncs when the battery is above 20%, so try recharging the device.
    Try disabling Photo Stream on your device (Settings > iCloud), restarting your device, and then re-enabling Photo Stream.
    If this doesn't help you may need to reset Photo Stream. You can do this at icloud.com by clicking on your name in the top right corner and then the advanced settings in the pop-up dialogue box that appears.
    Note: disabling Photo Stream and re-enabling it will result in any photos currently in the Photo Stream album on your device that are older than 30 days not being added back.

  • What does the Merge Data Stream processor do when there are multiple input streams to it from the same reader?

    Hi,
    I have a process with a reader of master data that outputs 5 records, which feed simultaneously into 3 different lookup and return processors.
    Each lookup and return processor brings some data back from a detail table. There can be multiple details so I follow each lookup processor with a split records from array processor. Hence I end up with 3 'streams' of data. Stream 1 has 8 records, stream 2 has 5 records and stream 3 has 6 records.
    I join all these streams to a Merge Data Streams processor.
    I end up with 9 records so although the help for the Merge Data Streams processor says 'Merge Data Streams does not perform any transformation, matching, or merging of records' there is clearly some merging going on.
    What is the behaviour of the merge data streams processor in this scenario?
    I have added attributes and flags into each of the streams. How many records should I see and what values should the added attributes/flags have (some records show attributes/flags from all 3 streams whereas others show just those attributes/flags from one stream).
    I have developed a test case simply to understand what the processor is doing but it's not obvious and furthermore it's probably unwise to develop EDQ processes where the processor behaviour is not documented and guaranteed to remain consistent. What I am trying to achieve is to bring all of a person's (the master data) various details (assignments, employers, etc.) together so we can check the data (some rules require data from multiple details).
    Thanks, Nik

    Cheers Mike - and for the explanation of the terms.
    I think I understand now how it's supposed to work.
    What I'm finding, however, is that when I set a flag to Y at the beginning of a path (one that includes a lookup and return and then a split records from array processor), that flag shows no value (i.e. an empty value) in SOME of the records shown in the subsequent MDS processor (it's fine in the very last split processor before we get to the MDS, but then again there are fewer records in that split processor than in the MDS).
    In my case there are obviously more records in the MDS processor than there were in the original reader (because the lookup and returns are configured to have unlimited maximum matches). As mentioned, the different paths return different numbers of records before being combined in the MDS. Say a reader has 5 records and path 1 returns 8 records in total, including a path-specific flag (flag1, set to Y), but path 2 (which again adds its own path-specific flag, flag2, set to Y) returns just 5 records (since nothing was added from the lookups): are you saying that flag2 would show as 'Y' for all 8 records shown in the MDS?
    Hopefully you would be able to see what I mean if you try to create a process like the one I've described (or I can upload a package).
    Re. the purpose of the separate paths approach it is simply to allow the visualisation ('showing the working' as Neil puts it) of the different checks being carried out by the process.
    This is considered one of the benefits of the tool over writing SQL queries (with outer joins, query criteria, etc.).
    Also, as mentioned I was following an example that Neil put together for us to ensure that we are doing things in a 'proper' and supported way.
    If we put all the lookups, etc. for all the checks into one datastream then it no longer becomes so understandable and the value of joining processors in a process over simply writing SQL becomes questionable; arguably the EDQ process in fact becomes less easy to understand than simply writing SQL.
    Also, to go down this route I will need to revise the processes that I have already developed (which were previously substantially working, until I revised them).
    Thanks, Nik

  • Create a continuous data stream from C++, and read it in LabView

    Hello all.
    I'm working on a project which involves connecting to a motion tracker and reading position and orientation data from it in real time. The code to get the data is in C++, so I decided that the best way to do this would be to create a C++ DLL which contains all the necessary functions to first connect to the device and then read the data from it, and use the Call Library Function node to feed this data into LabVIEW.
    I'm having trouble though, since ideally I would like a continuous stream of data from the C++ code into LabVIEW, and I'm not sure how to achieve this. Putting the Call Library Function node in a while loop seems like an obvious solution, but if I do it this way I would have to reconnect to the device every time I get the data, which is quite a bit too slow.
    So my question is: if I created a C++ function which created a data stream, could I read it into LabVIEW without having to continually call a function? I'd prefer to only have to call a function once, and then read the data stream until a stop command is given.
    I'm using LabVIEW 2010, version 10.0.
    Apologies if the question is poorly phrased; many thanks for your help.
    Dave

    dr8086 wrote:
    This method sounds like an excellent suggestion, but I do have a few questions where I don't think I've fully understood.
    From what I understand, the basic premise is to use one Call Library Function node to access a DLL which creates an instance of the device object and passes a pointer to it into LabVIEW. Then a separate Call Library Function node would pass this pointer to another DLL function which could access the device object, update it and read the data. This part could be in a while loop and carry on reading the data until a stop command is given.
    That's it. I'm including some skeleton code as an example. I'm also including the code because I don't know how much experience you have with multithreading, so I'm showing how you'd have to use critical sections to guard the interactions between threads so that they don't lead to issues.
    // exported functions to access the device
    extern "C" __declspec(dllexport) int __stdcall init(uintptr_t *ptrOut)
    {
        CDevice *dev = new CDevice();
        dev->init();                    // create the tracker and start the acquisition thread
        *ptrOut = (uintptr_t)dev;
        return 0;
    }

    extern "C" __declspec(dllexport) int __stdcall get_data(uintptr_t ptr, double vals[], int size)
    {
        return ((CDevice*)ptr)->get_data(vals, size);
    }

    extern "C" __declspec(dllexport) int __stdcall close(uintptr_t ptr, double last_vals[], int size)
    {
        int r = ((CDevice*)ptr)->close();
        ((CDevice*)ptr)->get_data(last_vals, size);
        delete (CDevice*)ptr;
        return r;
    }

    // h file
    // Represents a device
    class CDevice
    {
    public:
        virtual ~CDevice();
        int init();
        int get_data(double vals[], int size);
        int close();
        // only called by new thread
        int ThreadProc();
    private:
        CRITICAL_SECTION rBufferSafe;   // Needed for thread safety
        vhtTrackerEmulator *tracker;
        HANDLE hThread;
        double buffer[500];
        int buffer_used;
        bool done;  // this HAS to be protected by the critical section since 2 threads access it. Use a get/set method with critical sections inside
    };

    // cpp file
    DWORD WINAPI DeviceProc(LPVOID lpParam)
    {
        ((CDevice*)lpParam)->ThreadProc();  // Call the function to do the work
        return 0;
    }

    CDevice::~CDevice()
    {
        DeleteCriticalSection(&rBufferSafe);
    }

    int CDevice::init()
    {
        tracker = new vhtTrackerEmulator();
        InitializeCriticalSection(&rBufferSafe);
        buffer_used = 0;
        done = false;
        hThread = CreateThread(NULL, 0, DeviceProc, this, 0, NULL);  // this thread will now be saving data to an internal buffer
        return 0;
    }

    int CDevice::get_data(double vals[], int size)
    {
        int len;
        EnterCriticalSection(&rBufferSafe);
        if (vals)                           // copy out the buffered samples
        {
            len = min(size, buffer_used);
            memcpy(vals, buffer, len * sizeof(double));
            buffer_used = 0;                // Whatever wasn't read is erased
        }
        else                                // just return the buffer size
        {
            len = buffer_used;
        }
        LeaveCriticalSection(&rBufferSafe);
        return len;
    }

    int CDevice::close()
    {
        done = true;
        WaitForSingleObject(hThread, INFINITE);  // handle timeouts etc.
        delete tracker;
        tracker = NULL;
        return 0;
    }

    int CDevice::ThreadProc()
    {
        while (!done)
        {
            tracker->update();
            EnterCriticalSection(&rBufferSafe);
            if (buffer_used < 500)
                buffer[buffer_used++] = tracker->getRawData(0);
            LeaveCriticalSection(&rBufferSafe);
            Sleep(100);
        }
        return 0;
    }
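    For clarity, a hypothetical caller-side sketch of how the three exported functions above are meant to be driven; in LabVIEW each call would be its own Call Library Function node, with only get_data sitting inside the while loop:

    #include <windows.h>
    #include <cstdint>

    int main()
    {
        uintptr_t dev = 0;
        init(&dev);                            // create the device object and start its acquisition thread

        double vals[500];
        for (int i = 0; i < 100; ++i)          // stands in for the VI's while loop
        {
            int n = get_data(dev, vals, 500);  // n = number of buffered samples copied out
            // ... feed the n samples into the rest of the application ...
            Sleep(100);
        }

        double last[500];
        close(dev, last, 500);                 // stop the thread, fetch the remaining samples, free the object
        return 0;
    }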
    dr8086 wrote:
    My main concern is that the object may go out of memory or be deallocated, since it wouldn't be held in any namespace or anything.
    Since you create the object with new, the object won't expire until either the DLL is unloaded or the process (LabVIEW) closes. So the object will stay valid between DLL calls provided LabVIEW didn't unload the DLL (which it does if the VIs are closed). When that happens, I'm not exactly sure what happens to live objects (i.e. if you forgot to call close); I imagine the system reclaims the memory, but the device might still be open.
    What I do to make sure that everything gets closed when the DLL unloads before I could call close and delete the object is this: every time I create a new object in the DLL I add it to a list; when the DLL unloads, if the object is still on the list I delete it.
    dr8086 wrote:
    I also have a more general programming question about the purpose of the buffer. Would the buffer basically be a big table of position values, which are stored until they can be read into the rest of the VI? 
    Yes, see the example code.
    However, depending on the frequency with which you need to collect data from the device you might not need this buffer at all. I.e. if you collect a sample about every 100ms then you could remove all threading and buffer related functions and instead read the data from the read function itself like this:
    double CDevice::get_data()
    {
        tracker->update();
        return tracker->getRawData(0);
    }
    Because you'd only need a buffer and a separate thread if you collect data at a high frequency and you cannot lose any data.
    Matt

  • Validation on Inventory and Supplier Data Stream

    Hi Experts Team,
    My client wants validation on the Inventory and Supplier data streams. When I try to create a method, it shows only Total Records and Documents in the data stream field.
    How can I get the above two streams into the method?
    Can anyone suggest something?

    Thanks Dan for your speedy reply.
    I will do the validation according to document type.
    Thanks & Regards
    Madhu

  • Conversion of a binary data stream to decimal numbers

    Hi
    I am having great difficulty working out how to convert my binary data stream to decimal numbers.
    The data I am reading back is in the format of a binary string, starting with the most significant byte (MSB) of the first word, then the corresponding least significant byte (LSB), where a word is two bytes long. A carriage return indicates message termination. The return message starts with 'bin,' followed by the number of bytes requested. No delimiters are used to separate the data, but a carriage return is appended onto the end of the data.
    bin,<first word msb><first word lsb>...<last word lsb><CR>
    e.g. bin,$ro¬z1;@*...etc
    Does anybody know of any example VI that can help me convert this data from binary to decimal numbers?
    Many Thanks
    Ash

    Hi Ashley,
    after getting the string you can strip the first 4 characters. After this, try typecasting to an array of U16. If the numbers are not correct, you can add a swap-bytes operation to the resulting array.
    Message Edited by GerdW on 09-13-2006 02:46 PM
    Best regards,
    GerdW
    CLAD, using 2009SP1 + LV2011SP1 + LV2014SP1 on WinXP+Win7+cRIO
    Kudos are welcome
    Attachments:
    Convert.png ‏2 KB
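    GerdW's strip-then-typecast suggestion, written out as a plain C++ sketch for reference (assuming the words arrive most significant byte first as described, the "bin," prefix is 4 characters, and the trailing carriage return has already been removed):

    #include <cstdint>
    #include <string>
    #include <vector>

    // Combine each MSB/LSB byte pair into an unsigned 16-bit word; this is the
    // equivalent of typecasting the stripped string to an array of U16 (with a
    // byte swap if the numbers come out wrong).
    std::vector<uint16_t> parse_bin_reply(const std::string &reply)
    {
        std::vector<uint16_t> words;
        for (size_t i = 4; i + 1 < reply.size(); i += 2)
        {
            uint16_t msb = (unsigned char)reply[i];
            uint16_t lsb = (unsigned char)reply[i + 1];
            words.push_back((uint16_t)((msb << 8) | lsb));
        }
        return words;
    }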

  • Data Streaming error in BPM workspace

    Hi All
    We developed a BPM process and an ADF form with the BPM 11g suite and deployed them to Oracle WebLogic Server 10.3. While fetching data from the DB into the ADF form in the same process, small amounts of data come through fine, but while fetching a large volume of data into that ADF form we get a popup saying "Data streaming".
    Could you please guide me in resolving this issue?
    Thank you
    Balaji J

    Please paste the exact error stack along with your Jdev version for us to help you.

  • Problems Reading SSL server socket data stream using readByte()

    Hi, I'm trying to read an SSL server socket stream using readByte(). I need to use readByte() because my program acts as an LDAP proxy (it receives LDAP messages from an LDAP client and then passes them on to an actual LDAP server). It works fine with normal LDAP data streams, but once an SSL data stream is introduced, readByte() just hangs! Here is my code...
    help!!! anyone?... anyone?
    1. The SSL socket is first read into "InputStream input"
    public void run()
    {
        Authorization auth = new Authorization();
        try {
            InputStream input = client.getInputStream();
            while (true)
            {
                StandLdapCommand command;
                try
                {
                    command = new StandLdapCommand(input);
                    Authorization t = command.get_auth();
                    if (t != null)
                        auth = t;
                }
                catch (SocketException e)
                {   // If socket error, drop the connection
                    Message.Info("Client connection closed: " + e);
                    close(e);
                    break;
                }
                catch (EOFException e)
                {   // If socket error, drop the connection
                    Message.Info("Client connection close: " + e);
                    close(e);
                    break;
                }
                catch (Exception e)
                {
                    // Way too many of these to trace them!
                    Message.Error("Command not processed due to exception");
                    close(e);
                    break;
                    //continue;
                }
                processor.processBefore(auth, command);
                try
                {
                    Thread.sleep(40); // yield to other threads
                }
                catch (InterruptedException ie) {}
            }
        }
        catch (Exception e)
        {
            close(e);
        }
    }
    2. Then the data is sent to an intermediate function,
    from this statement in the function above: command = new StandLdapCommand(input);
    public StandLdapCommand(InputStream in) throws IOException
    {
        message = LDAPMessage.receive(in);
        analyze();
    }
    Then finally, the read function where it hangs at  "int tag = (int)din.readByte(); "
    public static LDAPMessage receive(InputStream is) throws IOException
    {
        /*  LDAP Message Format =
         *      1.  LBER_SEQUENCE                 --  1 byte
         *      2.  Length                        --  variable length = 3 + 4 + 5 ...
         *      3.  ID                            --  variable length
         *      4.  LDAP_REQ_msg                  --  1 byte
         *      5.  Message specific structure    --  variable length
         */
        DataInputStream din = new DataInputStream(is);
        int tag = (int)din.readByte();      // sequence tag
        ...

    I suspect you are actually getting an Exception and not tracing the cause properly and then doing a sleep and then getting another Exception. Never ever catch an exception without tracing what it actually is somewhere.
    Also I don't know what the sleep is supposed to be for. You will block in readByte() until something comes in, and that should be enough yielding for anybody. The sleep is just literally a waste of time.

  • TSV_TNEW_PAGE_ALLOC_FAILED - BCS load from data stream task

    Hi experts,
    We had a short dump when executing the BCS Load from Data Stream task. The message is TSV_TNEW_PAGE_ALLOC_FAILED:
    No storage space available for extending an internal table.
    What happened? How can we solve this error?
    Thanks
    Marilia

    Hi,
    Most likely, the remedy for your problem is the same as in my answer to your other question:
    Raise Exception when execute UCMON

  • SEM-BCS:Data Stream Upload

    Hi! All.
    I am facing an issue in the data stream upload. The target field 0COMPANY is 6 characters and the source field, company code (0COMP_CODE), is 4 characters in length. The system gives an error that the value of the target field exceeds the source field and says to use an InfoObject with a greater length. However, upon mapping Company to another InfoObject with 6 characters, used instead of 0COMP_CODE, the system returns yet
    another error: that it is not coming from the source system, or that the source system can't be determined, for the new InfoObject.
    I was thinking of changing the target field length to four. Is there any workaround for this issue? If the target field 0COMPANY is changed, what implications will it have, or will just changing the field in the data model do the trick?
    Thanks for your input....
    Victor

    Hello Victor,
    If I understood your problem correctly, I faced the same thing in the past.
    When customizing the load from data stream, set the length to 4 characters or try to include an offset of 2.
    Hope this helps.
    If yes, award points.
    Best regards,
    João Arvanas
