What is local data?

And why does it say that out of the 174.72 GB I have, 174.51 GB is used?
My Mac used to be a school laptop; the machines were wiped and we paid to keep them. So why is almost all of my local data storage used up?

Hi Severin,
The output from that property node is of the variant data type. You can convert it to a boolean by using the Variant to Data VI. See the attached example.
Sarah
Applications Engineer | National Instruments | UK & Ireland
Attachments:
data types of local variables_7.11.vi ‏41 KB

Similar Messages

  • What version of iPad 4G should I buy here in the US that can work with a local data provider in the Philippines?

    What version of iPad 4G should I buy here in the US that will work with a local data provider in the Philippines? AT&T or Verizon? Any ideas, anyone?

    Ah, I see. Ask your provider what type of cell service they use; the following outlines what the two iPad models offer.
    Model for AT&T: 4G LTE (700, 2100 MHz)3; UMTS/HSPA/HSPA+/DC-HSDPA (850, 900, 1900, 2100 MHz); GSM/EDGE (850, 900, 1800, 1900 MHz)
    Model for Verizon: 4G LTE (700 MHz)3; CDMA EV-DO Rev. A (800, 1900 MHz); UMTS/HSPA/HSPA+/DC-HSDPA (850, 900, 1900, 2100 MHz); GSM/EDGE (850, 900, 1800, 1900 MHz)
    Also ensure you don't buy a carrier-locked/subsidized iPad. Buy it free and clear, without a contract, so you can use it on your terms.
    I know a lot of international providers have pay-as-you-go data SIMs, and you can basically swap these out as you travel between different countries.

  • What is open data set and close data set

    What are OPEN DATASET and CLOSE DATASET, and how do I use the files in SAP directories?

    Hi,
    OPEN DATASET is used to read from or write to files on the application server; other than that, I am not aware of any way to do the same. Here is a short description:
    FILE HANDLING IN SAP
    Introduction
    • Files on application server are sequential files.
    • Files on presentation server / workstation are local files.
    • A sequential file is also called a dataset.
    Handling of Sequential file
    Three steps are involved in sequential file handling
    • OPEN
    • PROCESS
    • CLOSE
    Here processing of file can be READING a file or WRITING on to a file.
    OPEN FILE
    Before data can be processed, a file needs to be opened.
    After processing file is closed.
    Syntax:
    OPEN DATASET <file name> FOR {OUTPUT/INPUT/APPENDING}
    IN {TEXT/BINARY} MODE
    This statement sets SY-SUBRC to 0 if the file is opened successfully, or 8 if not.
    OUTPUT: Opens the file for writing. If the dataset already exists, this will place the cursor at the start of the dataset, the old contents get deleted at the end of the program or when the CLOSE DATASET is encountered.
    INPUT: Opens a file for READ and places the cursor at the beginning of the file.
    FOR APPENDING: Opens the file for writing and places the cursor at the end of file. If the file does not exist, it is generated.
    BINARY MODE: The READ or TRANSFER will be character-wise. Each time, 'n' characters are read or transferred. The next READ or TRANSFER will start from the next character position and not on the next line.
    IN TEXT MODE: The READ or TRANSFER will start at the beginning of a new line each time. If for READ, the destination is shorter than the source, it gets truncated. If destination is longer, then it is padded with spaces.
    Defaults: If nothing is mentioned, then defaults are FOR INPUT and in BINARY MODE.
    PROCESS FILE:
    Processing a file involves READing from the file or writing to it with TRANSFER.
    TRANSFER Statement
    Syntax:
    TRANSFER <field> TO <file name>.
    <Field> can also be a field string / work area / DDIC structure.
    Each TRANSFER statement writes one record to the dataset. In binary mode, it writes the length of the field to the dataset. In text mode, it writes one line to the dataset.
    If the file is not already open, TRANSFER tries to OPEN file FOR OUTPUT (IN BINARY MODE) or using the last OPEN DATASET statement for this file.
    IN FILE HANDLING, TRANSFER IS THE ONLY STATEMENT THAT DOES NOT RETURN SY-SUBRC.
    READ Statement
    Syntax:
    READ DATASET <file name> INTO <field>.
    <Field> can also be a field string / work area / DDIC structure.
    Each READ will get one record from the dataset. In binary mode it reads the length of the field and in text mode it reads each line.
    CLOSE FILE:
    The program will close all sequential files, which are open at the end of the program. However, it is a good programming practice to explicitly close all the datasets that were opened.
    Syntax:
    CLOSE DATASET <file name>.
    SY-SUBRC will be set to 0 or 8 depending on whether the CLOSE is successful or not.
    DELETE FILE:
    A dataset can be deleted.
    Syntax:
    DELETE DATASET <file name>.
    SY-SUBRC will be set to 0 or 8 depending on whether the DELETE is successful or not.
    Pseudo logic for processing the sequential files:
    For reading:
    Open dataset for input in a particular mode.
    Start DO loop.
    Read dataset into a field.
    If READ is not successful.
    Exit the loop.
    Endif.
    Do relevant processing for that record.
    End the do loop.
    Close the dataset.
    For writing:
    Open dataset for output / Appending in a particular mode.
    Populate the field that is to be transferred.
    TRANSFER the field to the dataset.
    Close the dataset.
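The OPEN / PROCESS / CLOSE pattern in the pseudo logic above is not ABAP-specific. As an illustration only (not ABAP, and the file name and types are hypothetical), here is the same read loop sketched in Java:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

public class SequentialRead {
    // OPEN the dataset, READ records until the read fails (EOF), CLOSE it.
    static List<String> readAll(Path dataset) throws IOException {
        List<String> records = new ArrayList<>();
        // OPEN DATASET ... FOR INPUT IN TEXT MODE
        try (BufferedReader in = Files.newBufferedReader(dataset)) {
            String record;
            // READ DATASET ... INTO record; exit the loop at end of file
            while ((record = in.readLine()) != null) {
                records.add(record); // relevant processing for that record
            }
        } // CLOSE DATASET (done automatically by try-with-resources)
        return records;
    }
}
```

As in the ABAP version, closing explicitly (here via try-with-resources) is good practice even though the runtime would eventually release the handle.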
    Regards
    Anver
    If this helped, pls mark points.

  • Strategies for Internationalization - localized data ?

    I am looking for some advice/tips/best practice for working with localized data.
    Java Locales are nice (at least for working with stuff that fits into property files), and JDeveloper has some of the best support I have seen for multilingual UIs. However, I am a bit at a loss for how to best approach localized data; that is, not field names but database-stored values.
    For example, for an online store you have a product catalog. For each product you have descriptions in various languages. In terms of Java Locale logic, some products might have a description in en, en_US, en_GB, en_CA, fr, fr_FR, fr_CA; others might have just en and fr; others might have some other set of languages. So for a client's locale, there might not be a single matching locale in the database.
    So then, what's a good approach for Java/BC4J design and database schema?
    Actually storing the data doesn't seem like a big deal.
    I was thinking of doing a table like
    PRODUCT_DESCRIPTION
    product_id PK (FK - PRODUCT)
    local_code PK
    product_description
    then I'd have a LOCAL table that had language code, country code, etc. (But maybe there's a better approach for working with the data via BC4J and Java.)
    However, I don't see a good way to work with the data in Java. It could get out of control trying to display search results of products, where for each product you have to work your way through the logic of Locales from most specific up to the default just to get the description.
    If it matters, this would be a webapp, so I'd have the browser settings, servlet2.3 locales, etc to work with. Also, I might have 200,000 products (which actually join up to a few million catalog items for pricing).
    So what would be good approach (simple and performant) for this?
    How would you set up your BC4J entities and views, etc., and pull back the right language?
    Is there any good info out there on working with database driven internationalized data in Java? Generally, everyone talks about property files and field labels and never mentions a database.
    thanks,
    -S

    I adopted this solution, which works fine but has some disadvantages (I describe them later).
    I have a lot of Tables with descriptions (like country, products, etc...) in various languages.
    My application is named Weborder, so I created with the wizard the two classes associated with the application module Weborder.java and WeborderImpl.java.
    In WeborderImpl.java I have a variable (lingua, I used an int so I can use then a switch construct) which stores the information about the language (and also a DEBUG flag)
    private boolean DEBUG = true;
    private int lingua = 1;
    /** This is the default constructor (do not remove) */
    public WeborderImpl() {
    }
    public boolean isDEBUG() {
        return DEBUG;
    }
    public void setDEBUG(boolean newDEBUG) {
        DEBUG = newDEBUG;
    }
    /* GESTIONE LINGUA (language handling) */
    public int getLingua() {
        return lingua;
    }
    public void setLingua(int ling) {
        // I have 4 languages
        if (ling >= 1 && ling <= 4) {
            lingua = ling;
        } else {
            if (DEBUG) System.err.println("Lingua " + ling + " NON CORRETTA"); // "language not valid"
        }
    }
    public void setLingua(String s) {
        try {
            setLingua(Integer.parseInt(s));
        } catch (Exception e) {
            if (DEBUG) System.err.println("Lingua " + s + " NON AMMESSA"); // "language not allowed"
        }
    }
    /* GESTIONE LINGUA */
    The accessor methods are visible through the Weborder.java interface class (you do this with the edit appl. module wizard, in the "Client Method" tab).
    When the user logs in, I set the language in the Application Module, so it is visible from every View in the Application Module. Here is a method that I call from the jsp page that checks the login:
    public static boolean validSignOn(HttpServletRequest request, ApplicationModule am) {
        // ... (login checks elided in the original post)
        setLingua(wo, (String) user.get("lingua"));
        return ok;
    }
    public static void setLingua(ApplicationModule am, String newLingua) {
        Weborder wo = (Weborder) am;
        try {
            wo.setLingua(Integer.parseInt(newLingua));
        } catch (Exception e) {
            System.err.println("WOLogin; lingua errata :" + newLingua); // "wrong language"
        }
    }
    In the jsp page it is invoked in this way:
    <jbo:ApplicationModule id="Weborder" configname="webord.dm.Weborder.WeborderLocal" releasemode="Stateful" />
    <jbo:DataSource id="loginVo" appid="Weborder" viewobject="LoginView" rangesize="-1" />
    <% boolean ok = WOLogin.validSignOn(request,loginVo.getRowSet().getApplicationModule()); %>
    You can invoke WOLogin.setLingua() from every jsp page to switch the language (and similarly the DEBUG variable).
    Then I have a View (DOrdiniView) that displays the detail of an order, and I want the products to get the right description.
    - Using the wizard I select the two entities (the first with the order lines, "DOrdini", the second with the product description, "DArtOrdBol"), and there I add a transient attribute called Description of type String.
    - Using the wizard I create the two Java classes DOrdiniViewImpl and DOrdiniViewRowImpl.
    - In the class DOrdiniViewRowImpl.java I modify the getDescription method:
    public String getDescription() {
        // return (String) getAttributeInternal(DESCRIPTION);
        int lingua = ((Weborder) getApplicationModule()).getLingua();
        switch (lingua) {
            case 2: return getDArtOrdBol().getDescrizioneTed(); // German
            case 3: return getDArtOrdBol().getDescrizioneUk();  // English
            case 4: return getDArtOrdBol().getDescrizioneFra(); // French
            default: return getDArtOrdBol().getDescrizioneIta(); // Italian
        }
    }
    And this is all.
    When I use this ViewObject, I only have to call getAttribute("Description"), which returns the description in the right language. I never have to care about the language; I only have to set it and everything works.
    All this works fine, but there are two problems:
    1. for every view object you create, you have to override the method getDescription() in the ViewRowImpl.java class, but this is not so problematic;
    2. if you have to use that attribute in a SQL statement, you can't! An example is for an ORDER BY clause (for a "SELECT .. WHERE description LIKE ..." statement, I use a column that is composed of all the descriptions).
    It would be a better idea to have such a method directly in the Entity Objects, so you can use the column in the SQL statements and you have to insert java code only once per entity.
    The problem is that the language is specific for the user session, so the better place to store it is (I think) in the Application Module.
    Initially I used another approach, inserting the description as a column like this:
    decode(?,2,DESCR_DE,3,DESCR_UK,4,DESCR_FR,DESCR_IT) AS DESCRIPTION
    and passed the parameter every time I used the view object, using the method setWhereClauseParam(..,..) of the class ViewObject.
    This was not such a good idea, because I got a lot of "not all variables bound" SQLExceptions from ViewObjects that I was not using in the particular JSP page I invoked with the browser (view objects that were children of the view object I was using).
    So, if this can help...
    I hope someone can give some enhancement on this solution, that solves the problems I have just pointed out.
    bye,
    Marco.
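The most-specific-to-default locale lookup the original question asks about can be sketched like this (an illustration only; the Map stands in for the question's hypothetical PRODUCT_DESCRIPTION table keyed by locale code, and all names here are invented):

```java
import java.util.Map;

public class LocaleFallback {
    /**
     * Resolve a description by walking from the most specific locale code
     * (e.g. fr_CA) up through its parents (fr) to a default, as described
     * in the question.
     */
    static String resolve(Map<String, String> descriptions, String localeCode, String defaultCode) {
        String code = localeCode;
        while (!code.isEmpty()) {
            String hit = descriptions.get(code);
            if (hit != null) return hit;
            int cut = code.lastIndexOf('_');              // fr_CA -> fr -> (default)
            code = (cut < 0) ? "" : code.substring(0, cut);
        }
        return descriptions.get(defaultCode);             // final fallback
    }
}
```

Doing this walk in Java per row is exactly the cost the question worries about for 200,000 products; Marco's switch in the ViewRowImpl and the decode(...) column are two ways of pushing the same fallback decision closer to the data.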

  • Local data persistence in Interactive workflows

    I have read through all the threads regarding Data persistence and do understand that form data is stored in xml format. I still have the following doubt:
    For all the custom workflows we are developing for this client (using Adobe Interactive Forms as the UI in the workflows), we want local data persistence. Is there a model that covers the data collected through the forms, including data retrieved from SAP at the beginning of the workflow and data manually entered by the form's user?
    Scenario:
        cost xx     - approver 1
        cost xx + 1 - approver 2
    Form data is filled in with cost xx, and hence an email goes to approver 1.
    Now the details change and the cost for that sales order increases to xx + 1.
    Will my email still be in the inbox of approver 1?
    What if I want data persistence where from the initiation to the end of workflow the data should be persisted irrespective of the changes.

    Hello Kavitha,
    Your work item will stay where it is unless you've specifically modelled it to react to a change. In standard purchasing you would use a change event triggered from ME22N or the like, but unfortunately I'm not familiar in depth with Adobe Forms, so I don't know what happens when data is changed on a form once the WF has started.
    However if you want data to remain constant throughout the life of the WF then I'd suggest binding it to a WF container when the doc is first created and then working with that value.
    Hope that helps,
    Mike

  • How to get the local date time value

    Hi,
    I am in an Australian time zone. I have been using the code below to retrieve the meeting start date/time from my SharePoint calendar.
    Console.WriteLine((DateTime)oListItem["EndDate"]);
    This used to return the correct local date up until today. I simply read this value and store it in a database. Today when I checked, I saw that the date returned is different from what I can see in the SharePoint calendar.
    For example: in the SharePoint calendar I can see the date as "Oct 1 2014 5:30 PM", but my code above now returns "Oct 1 2014 6:30 AM". This looks to me like UTC time.
    But up until today my code returned the correct local date. Is there any reason for this sudden change?
    When I read the time zone in SharePoint (using RegionalSettings.TimeZone) it returns the correct time zone too.
    Any help would be appreciated. I am concerned that I now have to go and change all the places where I read date/time values from SharePoint.
    Thanks, Bimal

    Hi,
    According to your post, my understanding is that you have an issue with the calendar times.
    As your code worked well before, the issue may be related to the calendar itself or to the site.
    Since the time zone is correct, you can create a new calendar and check whether it works,
    or create a new site and check there.
    I have made a simple demo that retrieves the calendar EndDate; you can check against it.
    using (SPSite site = new SPSite("http://YourSiteURL"))
    using (SPWeb web = site.OpenWeb())
    {
        SPList spList = web.Lists.TryGetList("Calendar1");
        if (spList != null)
        {
            SPQuery qry = new SPQuery();
            qry.ViewFields = @"<FieldRef Name='EventDate' /><FieldRef Name='EndDate' />";
            SPListItemCollection listItems = spList.GetItems(qry);
            foreach (SPListItem item in listItems)
            {
                string startTime = item["EventDate"].ToString();
                string endTime = item["EndDate"].ToString();
                Console.WriteLine("Start Time: " + startTime);
                Console.WriteLine("End Time: " + endTime);
            }
        }
    }
    Thanks,
    Jason
    Forum Support
    Please remember to mark the replies as answers if they help and unmark them if they provide no help. If you have feedback for TechNet Subscriber Support, contact
    [email protected]
    Jason Guo
    TechNet Community Support
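Jason's demo prints whatever the API returns. If, as Bimal suspects, the value is now coming back in UTC, converting it to the site's time zone before storing it is straightforward. A sketch of that conversion (in Java rather than the C# of the thread; the zone is an assumption, since the question only says "Australian time zone"):

```java
import java.time.LocalDateTime;
import java.time.ZoneId;
import java.time.ZoneOffset;

public class ToSiteZone {
    // Interpret a wall-clock value as UTC and shift it to the given zone.
    static LocalDateTime toLocal(LocalDateTime utcValue, String zoneId) {
        return utcValue.atZone(ZoneOffset.UTC)
                       .withZoneSameInstant(ZoneId.of(zoneId))
                       .toLocalDateTime();
    }
}
```

The C# equivalent would use TimeZoneInfo.ConvertTimeFromUtc; either way, the safer fix is to store the instant in UTC and convert only for display.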

  • Local Data AddressDoes not belong to any of this hosts local interfaces

    I'm facing this error when I try to run the applet from a client that downloads the applet and tries to connect to the IP address of the origin.
    When I run it locally, there is no error!
    Example: on 10.1.0.2 I browse to 10.1.0.1/transmitapplet.html, and the error "Local Data AddressDoes not belong to any of this hosts local interfaces" occurs.
    On 10.1.0.1, I browse to 10.1.0.1/transmitapplet.html and the applet runs properly.
    Anybody knows what is the problem?

    Sorry, but I didn't have this problem until last week, when I tried to run my application as an applet...
    The problem is that an applet is not allowed to read the local IP address. I've solved this in a slightly strange way, by using a dummy SessionManager:
    public String getLocalAddress() {
        String result = null;
        SessionAddress addr = null;
        RTPSessionMgr mngrSession = (RTPSessionMgr) RTPManager.newInstance();
        if (mngrSession != null) {
            // empty address will use the standard IP
            SessionAddress sessAddr = new SessionAddress();
            try {
                mngrSession.initialize(sessAddr);
                addr = mngrSession.getLocalSessionAddress();
            } catch (InvalidSessionAddressException e1) {}
            if (addr != null) {
                result = addr.getDataHostAddress();
            }
            mngrSession.closeSession();
            mngrSession.dispose();
            return result;
        }
        else return null;
    }
    This method does not support multiple IP-addresses.

  • SPROXY: No connection to Integration Builder (only local data visible)

    Hello!
    When I try to access the transaction SPROXY I get the message "No connection to Integration Builder (only local data visible)".
    I already did the configuration steps described in the XI Configuration Guide, section 8: Connecting Business Systems with an Integration Engine to the Central Integration Server.  I also checked that the parameters in the Exchange Profile were correct.
    Transaction SLDCHECK is working ok.  The TCP/IP connections SAPSLDAPI & LCRSAPRFC work OK when I do Test Connection.
    In transaction SPROXY, when I do Connection Test the first 2 steps work fine, but I get an error on the last two:
    1.  The address of the Integration Builder must be stored in the R/3
         System
         =>Check/maintain with report SPROX_CHECK_IFR_ADDRESS
    Result: OK: Address maintained
    2. The HTTP connection of the R/3 application server must function
    correctly
    =>Check with report SPROX_CHECK_HTTP_COMMUNICATION
    Result: HTTP communication functioning
    The following 2 steps return an error:
    3. The Integration Builder server must be running correctly
    =>Check with report SPROX_CHECK_IFR_RESPONSE
    Result: Integration Builder data not understood
    4. Proxy generation must correctly interpret the data of the
    Integration Builder
    ==>Check with report SPROX_CHECK_IFR_CONNECTION
    Result: Integration Builder data not understood
    I've been looking in the forums and weblogs but haven't been able to figure out what could be wrong with my system.   Can someone please help?
    Thanks and Regards,
    Carlos

    Carlos,
    Check the Global Configuration Data in R/3 using transaction SXMB_ADM.
    You need to set the values Role of Business System: Application System, and the corresponding Integration Server details.
    Also check your HTTP RFC connection: INTEGRATION_DIRECTORY_HMI.
    Nilesh

  • How to get an EAR package for deployment without local data-source settings in JDev9i?

    Hi all,
    I want to use pooled data-sources defined in OC4J's global configuration setting directory in JDEV_HOME\j2ee\home\config.
    But when I packaged my application into an EAR file, it contained its local data-sources.xml file in the package. Each time the server was restarted, the local data-sources.xml file was overwritten by the original settings in the EAR package, so I cannot use the pooled settings.
    So could you let me know how to package an EAR file that does not contain its local data-source settings but uses the global data-source settings instead? Or how to keep the local data-source settings from being changed after they have been manually defined?
    Your reply is greatly appreciated!
    Regards,
    Robbin Woo

    Mike,
    You don't need a File to do what you want. Even if you could, it would be wrong to use a file.
    It would be wrong because the compiler might not be using a JavaFileManager that is actually backed by a filesystem. It could be backed by some form of repository such as a database, or it could be backed by a transient in-memory file system. (An example of the latter is the [annotation processor test framework|https://hickory.dev.java.net/nonav/apidocs/index.html?net/java/dev/hickory/testing/package-summary.html] in Hickory.)
    But probably of more interest to you, you don't need to because you can simply [get an InputStream|http://java.sun.com/javase/6/docs/api/javax/tools/FileObject.html#openInputStream()] from the FileObject returned from [filer.getResource(...)|http://java.sun.com/javase/6/docs/api/javax/annotation/processing/Filer.html#getResource(javax.tools.JavaFileManager.Location,%20java.lang.CharSequence,%20java.lang.CharSequence)] and read it.
    You might also find the [Compiler Tree API|http://java.sun.com/javase/6/docs/jdk/api/javac/tree/index.html] useful. The Compiler Tree API is available to annotation processors running in Oracle's (Sun's) javac, but not in other compilers (possibly limiting portability).
    With this API you can [convert |http://java.sun.com/javase/6/docs/jdk/api/javac/tree/com/sun/source/util/Trees.html#getPath(javax.lang.model.element.Element)] the Element representing the annotation to a Treepath then [get the leaf node|http://java.sun.com/javase/6/docs/jdk/api/javac/tree/com/sun/source/util/TreePath.html#getLeaf()] and from that tree [obtain its offsets|http://java.sun.com/javase/6/docs/jdk/api/javac/tree/com/sun/source/util/Trees.html#getSourcePositions()] within the source file.
    Bruce

  • Error : Can't open local data port

    Hello:
    I'd like to present this problem to you:
    I am executing two files, a client and a server, on my own PC, using two different DOS consoles: one for the client and another for the server.
    I got this error:
    Output of the client (DOS console of the client):
    C:\RUN\EX14>java AVReceive2 1.1.9.147/45300
    - Open RTP session for: addr: 1.1.9.147 port: 45300 ttl: 1
    - Waiting for RTP data to arrive...
    - Waiting for RTP data to arrive...
    - A new participant had just joined: mpalacios@mpv
    - Waiting for RTP data to arrive...
    - Waiting for RTP data to arrive...
    Output of the Server (DOS Console of the Server)
    C:\RUN\EX14>java AVTransmit2 file:/c:/run/format/au/drip.au 1.1.9.147 45300
    Track 0 is set to transmit as:
    ULAW/rtp, 8000.0 Hz, 8-bit, Mono, FrameSize=8 bits
    Error : Can't open local data port: 45300
    Well, 1.1.9.147 is the IP address of my PC.
    So, what's the problem?
    If you need the rest of the code, please tell me.
    Thanks.
    Angel.

    Well, I agree with you, but I have this code in the transmit file (server):
    /**
     * Use the RTPManager API to create sessions for each media track of the processor.
     */
    private String createTransmitter() {
        // Cheated. Should have checked the type.
        PushBufferDataSource pbds = (PushBufferDataSource) dataOutput;
        PushBufferStream pbss[] = pbds.getStreams();
        rtpMgrs = new RTPManager[pbss.length];
        SessionAddress localAddr, destAddr;
        InetAddress ipAddr;
        SendStream sendStream;
        int port;
        SourceDescription srcDesList[];
        for (int i = 0; i < pbss.length; i++) {
            try {
                rtpMgrs[i] = RTPManager.newInstance();
                // The local session address will be created on the
                // same port as the target port. This is necessary
                // if you use AVTransmit2 in conjunction with JMStudio.
                // JMStudio assumes - in a unicast session - that the
                // transmitter transmits from the same port it is receiving
                // on and sends RTCP Receiver Reports back to this port of
                // the transmitting host.
                port = portBase + 2 * i;
                ipAddr = InetAddress.getByName(ipAddress);
                localAddr = new SessionAddress(InetAddress.getLocalHost(), port);
                destAddr = new SessionAddress(ipAddr, port);
                rtpMgrs[i].initialize(localAddr); // was rtpMgrs.initialize(localAddr): the [i] was missing
                rtpMgrs[i].addTarget(destAddr);
                System.err.println("Created RTP session: " + ipAddress + " " + port);
                sendStream = rtpMgrs[i].createSendStream(dataOutput, i);
                sendStream.start();
            } catch (Exception e) {
                return e.getMessage();
            }
        }
        return null;
    }
    When I use these client/server files between different PCs on the LAN, e.g. the client on one PC and the server on another, I used:
    On one PC (IP address = 1.1.9.150):
    C:\run\ex14\java AVTransmit2 file:/c:/run/format/audrip.au 1.1.9.147 45300
    On the other PC (IP address = 1.1.9.147):
    C:\run\ex14\java AVReceive2 1.1.9.147/45300
    That works right!
    So, what's happening?

  • What kind of data are users copying to their USB drives?

    We need to monitor what kind of data users are copying to their USB drives. Do we have any scripts that can trigger emails when data is copied to a USB drive, with the file name, size, and the location it was copied from?
    Please advise.
    Kris

    If you want to prevent users from copying stuff to their USB drives, the best way would be to enforce a group policy that prevents USB devices from working on local machines. There is no easy way of monitoring file-transfer activity via script. What you appear to be looking for is software that monitors your employees.
    What you can do is list the USB Devices that have been attached to the system:
       Push-Location      
       Set-Location HKLM:\
       $devices = Get-Item 'HKLM:\SYSTEM\CurrentControlSet\Enum\USBSTOR\*\*'
       Get-ItemProperty $devices |   
          Select-Object -Property @{Name='SerialNumber';Expression={$_.PSChildName.TrimEnd('&0').split('&')[-1]}}, FriendlyName,Class
       Pop-Location 

  • RTP Cannot open local data port

    Hello,
    I'm about to learn JMF, so I tried the example code from the Sun JMF documentation (example 10-1; the code is appended).
    The code compiles without errors, but when I run it I always get the following error:
    streams is [Lcom.sun.media.multiplexer.RawBufferMux$RawBufferSourceStream;@b9b538 : 1
    sink: setOutputLocator rtp://192.168.0.6:49150/audio/1
    java.io.IOException: Can't open local data port: 4800
            at com.sun.media.datasink.rtp.Handler.open(Handler.java:139)
            at Send.<init>(Send.java:60)
            at Send.main(Send.java:71)
    What does that mean and what causes that kind of exception?
    Thanks in advance.
    MfG
    Michael
    ****Code starts here****
    class Send {
        public Send() {
            AudioFormat format = new AudioFormat(AudioFormat.LINEAR, 8000, 16, 1);
            Vector devices = CaptureDeviceManager.getDeviceList(null);
            System.out.println("DEBUG: CaptureDeviceManager returned " + devices.size() + " devices");
            CaptureDeviceInfo di = null;
            Processor p = null;
            if (devices.size() > 0)
                di = (CaptureDeviceInfo) devices.elementAt(0);
            else
                System.exit(-1);
            System.out.println("Found: " + di.toString());
            try {
                p = Manager.createProcessor(di.getLocator());
            } catch (Exception e) {
                e.printStackTrace();
                System.exit(-1);
            }
            // Note: configure() is asynchronous; getTrackControls() may only be
            // called once the processor has reached the Configured state.
            p.configure();
            TrackControl tracks[] = p.getTrackControls();
            p.setContentDescriptor(new ContentDescriptor(ContentDescriptor.RAW));
            boolean encodingOk = false;
            for (int i = 0; i < tracks.length; i++) {
                if (!encodingOk && tracks[i] instanceof FormatControl) {
                    if (((FormatControl) tracks[i]).setFormat(new AudioFormat(AudioFormat.GSM_RTP, 8000, 16, 1)) == null)
                        tracks[i].setEnabled(false); // was tracks.setEnabled(false): the [i] was missing
                    else
                        encodingOk = true;
                } else {
                    tracks[i].setEnabled(false);
                }
            } // for
            if (encodingOk) {
                p.realize();
                DataSource source = null;
                try { Thread.sleep(1000); } catch (InterruptedException e) {}
                try {
                    source = p.getDataOutput();
                } catch (NotRealizedError e) {
                    System.out.println("*** ERROR: Cannot realize");
                    //System.exit(-1);
                }
                try {
                    String url = "rtp://192.168.0.6:49150/audio/1";
                    System.out.println(url);
                    MediaLocator m = new MediaLocator(url);
                    DataSink sink = Manager.createDataSink(source, m);
                    sink.open();
                    sink.start();
                } catch (Exception e) {
                    System.out.println("*** ERROR: ");
                    e.printStackTrace();
                    System.exit(-1);
                }
            } // if
        }
        public static void main(String[] args) {
            new Send();
        }
    }

    Hello,
    I tried to change the port number to 22222, 66666 and 49150, with the same effect. I found out that when I run the program for the first time after a reboot I get these errors:
    javax.media.NotConfiguredError: getTrackControls cannot be called before configured
    at com.sun.media.ProcessEngine.getTrackControls(ProcessEngine.java:285)
    at com.sun.media.MediaProcessor.getTrackControls(MediaProcessor.java:107)
    at Send.<init>(Send.java:29)
    at Send.main(Send.java:71)
    Exception in thread "main" javax.media.NotConfiguredError: getTrackControls cannot be called before configured
    at com.sun.media.ProcessEngine.getTrackControls(ProcessEngine.java:285)
    at com.sun.media.MediaProcessor.getTrackControls(MediaProcessor.java:107)
    at Send.<init>(Send.java:29)
    at Send.main(Send.java:71)
    I have to quit the program with Ctrl-C. The next time I run it, I get the IOException described above :(.
    MfG
    Michael

  • The creation of device local data objects - Unable to find generated object

    Hi all,
    I get an error when I try to generate an ESDMA containing a "Device Local" data object.
    I have created a "Device Local" data object containing a very simple node structure. The object is "active" and everything seems to work fine, until I generate the ESDMA. The following error is shown : Unable to find generated objects in repository.
    The ESDMA is generated without problems when I remove the device-local data object.
    I hope you have some ideas of what the issue might be.
    Best regards,
    Nima

    Hi Siva,
    Actually I am working on this problem together with Nima, and the problem is not related to SAP Basis objects.
    The problem is in SDOE_WB, with the SWCV. Create a data object of type LOCAL DEVICE with a node; if you add one or more fields and then activate, and look in Generated Objects, the table object is not generated (only the structure and the table type). At this point there are no syntax or error messages.
    If we try to generate an ESDMA using the ESDMA design time with this SWCV, it fails with the message "Unable to find generated objects in the repository"; that is, it is looking for the table object that was not generated in SDOE_WB. This is only a problem when creating a data object of type DEVICE LOCAL; every other type of DO is fine. I assume it is an SAP error.

  • What is Master Data Harmonization?

    Hi dear friends,
    1) What is Master Data Harmonization?
    2) Please explain it to me with one good example.
    3) What are the six core business scenarios of SAP NetWeaver MDM? Please explain each with one example.

    Hi,
    SAP provides six scenarios, or approaches to working with SAP NetWeaver MDM, to help companies maintain their master data while at the same time supporting a flexible, innovative IT environment. These are as follows:
    1. Master Data Consolidation
    In this scenario, users use SAP NetWeaver MDM to collect master data from several systems at a central location, detect and clean up duplicate and identical objects, and manage the local object keys for cross-system communication. With this consolidated data, users can access the information they need to perform company-wide analyses and consolidated reporting.
    2. Master Data Harmonization
    This scenario enhances the Master Data Consolidation scenario by forwarding the consolidated master data information to all connected, remote systems, thus depositing unified, high-quality data in heterogeneous system landscapes. With this scenario, you can synchronize central data contents (that is, globally relevant data) based on your results. For example, you can assign the same address to all occurrences of a particular customer.
    3. Central Master Data Management
    Whereas the emphasis in the Master Data Harmonization scenario is on local data maintenance, the Central Master Data Management scenario focuses on cleansing data in the central data repository. It then distributes the cleansed data to the connected application systems.
    4. Rich Product Content Management
    This scenario is intended for the product information management (PIM) market, as it offers many functions for managing product data and corresponding image files centrally via SAP NetWeaver MDM. It can also be used to prepare for publishing product catalogs, either in electronic Web format or in print.
    5. Customer Data Integration
    The Customer Data Integration scenario lets you harmonize customer master data records across heterogeneous systems.
    6. Global Data Synchronization
    Through Global Data Synchronization, unified object information is synchronized with data pools, such as UCCnet and Transora, in a standard industry format, and then provided to trading partners.
    Through these different ways of using the capabilities of SAP NetWeaver MDM, companies can structure a master data strategy around their data unification goals, based on their unique business processes, organizational structure, or industry.
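    As a rough sketch of how scenario 2 (Master Data Harmonization) relates to scenario 1, the snippet below consolidates one customer held under different local keys in two systems and then pushes the cleansed "golden" value back to both. The system names, keys, and record shapes are invented purely for illustration, not taken from any SAP API.

```python
# Sketch of consolidation -> harmonization: the same real-world customer
# exists under different local object keys in two connected systems; the
# central, cleansed address is written back to every occurrence.

remote_systems = {
    "ERP": {"C100": {"name": "ACME Ltd", "address": "1 Old Rd"}},
    "CRM": {"K-77": {"name": "ACME Ltd", "address": "5 New St"}},
}

def harmonize(systems, key_mapping, golden):
    """Push centrally cleansed fields to each local object key."""
    for system, local_key in key_mapping:
        systems[system][local_key].update(golden)

# Key mapping: one customer, two local keys across the landscape.
key_mapping = [("ERP", "C100"), ("CRM", "K-77")]
golden = {"address": "5 New St"}   # the cleansed central value

harmonize(remote_systems, key_mapping, golden)
print(remote_systems["ERP"]["C100"]["address"])  # now "5 New St"
```

    The point of the example is the direction of flow: harmonization distributes the central result outward, whereas plain consolidation only collects inward.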
    Regards
    Richa

  • What is Master Data Consolidation?

    Hi Friends,
    What is Master Data Consolidation?
    Please explain this procedure with one good example.

    Hi,
    Master Data Consolidation means:
    1. Load master data from client systems
    2. Identification and consolidation of similar or identical objects
    3. Provision of key mapping for unified, company-wide analytics and business operations.
    4. Aggregation of master data across SAP and non-SAP systems into a centralized master data repository. Once data is consolidated, you can search for data across linked systems, identify identical or similar objects across systems, and provide key mapping for reliable company-wide analytics and reporting.
    In this scenario, these are the following steps:
    1. Users use SAP NetWeaver MDM to collect master data from several systems at a central location,
    2. Detect and clean up duplicate and identical objects, and manage the local object keys for cross-system communication.
    3. With this consolidated data, users can access the information they need to perform company-wide analyses and consolidated reporting.
    For example: suppose Wipro has different branches, such as Wipro Technologies, Wipro Limited, and Wipro Infotech, and each branch does business with the same customer. Each branch has its own name and its own ID, but overall it is one company called Wipro.
    So all of these records are duplicates, and hence they cause inconsistencies when this data is distributed across different departments of an organization. We consolidate this data by sending it to MDM, where all the information is consolidated, updated, and managed properly.
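    The Wipro example above can be sketched in a few lines: detect records that refer to the same company and build a key mapping from each source record to one consolidated master ID. Matching on a normalized name prefix is a deliberate oversimplification of real duplicate detection, and all IDs and names here are invented for illustration.

```python
# Sketch of the consolidation step: group duplicate records and map
# every local source key to a single consolidated master ID.

records = [
    {"id": "S1-001", "name": "Wipro Technologies"},
    {"id": "S2-042", "name": "Wipro Limited"},
    {"id": "S3-007", "name": "Wipro Infotech"},
]

def consolidate(records):
    masters, key_mapping = {}, {}
    for rec in records:
        # Crude duplicate detection: first word of the name, lowercased.
        group = rec["name"].split()[0].lower()
        # setdefault keeps the first master ID assigned to this group.
        master_id = masters.setdefault(group, f"MDM-{len(masters) + 1:04d}")
        key_mapping[rec["id"]] = master_id
    return masters, key_mapping

masters, key_mapping = consolidate(records)
print(key_mapping)   # every source key maps to the same master ID
```

    The resulting key mapping is what enables cross-system communication afterwards: reports can join on the master ID even though each branch system keeps its own local key.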
    Regards
    Richa
