Data Utilization Tool Question

Does anyone know what the 'Technology' sub-category in the Web & Apps category of the new Data Utilization Tool is?
On my account, one line's usage is comprised of 30% Technology, and Verizon has thus far been unable to be specific about what this means.  Because we have seen a tremendous spike in data usage, I'd like to know what function/program to turn off to avoid the excessive data usage, but no one can tell me what that would be.  Anyone??

baylakates, that is a wonderful question. I want to make sure that we are able to get you all the details that you are looking for. The Technology category includes applications and websites for technology companies building tethering technologies, and websites offering content for software programmers. So this could be any third-party applications that were downloaded from the app stores.
KevinR_VZW
Follow us on Twitter @VZWSupport
If my response answered your question please click the "Correct Answer" button under my response. This ensures others can benefit from our conversation. Thanks in advance for your help with this!!

Similar Messages

  • Small data mart tools question

    Sorry, I'm rather new to OLAP!
    We have a small operational system that creates approximately 120K records a year in a single table, with a couple of two-level lookup tables. The time component is stored with the measure, which is already aggregated to the desired "day" granularity.
    CategoryLevel1 --> CategoryLevel2 --> Measure with date <-- LocationLevel1 <-- LocationLevel2
    We want to target a "lightweight" BI design using Pentaho, Mondrian and Saiku against an Oracle database. If we need another schema, it's fine to have that in the same database.
    We are considering simply using materialized views as fact and dimension tables for ETL, as described here:
    http://ww1.ucmss.com/books/LFS/CSREA2006/IKE4645.pdf
    Is this a common approach? Are there any drawbacks of significance for our effort?
    I appreciate any insight you can provide.
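    For concreteness, here is a minimal sketch of the materialized-view-as-fact-and-dimension-table idea in Oracle SQL; every table and column name below is an illustrative assumption, not the actual schema:
    -- Dimension built as a materialized view over the two-level lookup tables (names invented for this sketch).
    CREATE MATERIALIZED VIEW category_dim_mv
      BUILD IMMEDIATE
      REFRESH COMPLETE ON DEMAND
    AS
    SELECT l2.id   AS category_key,
           l2.name AS category_level2,
           l1.name AS category_level1
    FROM   category_level2 l2
    JOIN   category_level1 l1 ON l1.id = l2.parent_id;
    -- Fact "table" as a materialized view over the operational table, already at day granularity.
    CREATE MATERIALIZED VIEW measure_fact_mv
      BUILD IMMEDIATE
      REFRESH COMPLETE ON DEMAND
    AS
    SELECT o.category_level2_id AS category_key,
           o.location_level2_id AS location_key,
           o.measure_date,
           SUM(o.measure_value) AS measure_value
    FROM   operational_measures o
    GROUP  BY o.category_level2_id, o.location_level2_id, o.measure_date;
    Mondrian's schema would then point at these materialized views as if they were ordinary fact and dimension tables, and refreshing them on a schedule stands in for ETL.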

    I am not sure if this will help you, but there is a nice white paper on how the Oracle database OLAP option can be used at http://www.oracle.com/technetwork/database/options/olap/oracle-olap-11gr2-twp-132055.pdf.
    Other OLAP collateral can be found at Oracle OLAP.
    --Ken Chin

  • RE: DataField, update underlying data via TOOL, Express

    John,
    Does it work if you mix the "CopyfromClipboard" method with "PlaceValueinDisplayedField"?
    If this is not the correct solution to your problem, could you please specify "where" it does not work?
    Thanks a lot indeed.
    Best regards
    /Stefano
    Stefano POGLIANI Forté Software Consultant
    E-Mail : [email protected] Tel : +33.(0)450201025
    Fax : +33.(0)450200257 Mobile : +33.(6)08431221
    Visit the Forté Web Site : http://www.forte.com/
    Ducunt fata volentem, nolentem trahunt....
    -----Original Message-----
    From: John Hodgson [SMTP:[email protected]]
    Sent: Wednesday, July 02, 1997 8:39 PM
    To: [email protected]
    Subject: DataField, update underlying data via TOOL, Express
    In TOOL code we PasteText() into a DataField, but the underlying data object does not get updated until the user interacts with the GUI. That causes problems if we need to use the underlying data object's value immediately after the paste. How can we:
    - force an update of the underlying data object, and
    - ensure that the update goes through before our method call returns, i.e., ensure that if the update is via Forte events, those events are handled before returning?
    The context is a calendar lookup button that pastes into an adjoining
    DataField.
    John Hodgson |Descartes Systems Group Inc.|[email protected]
    Systems Engineer|120 Randall Drive |http://www.descartes.com
    |Waterloo, Ontario |Tel.: 519-746-8110 x250
    |CANADA N2V 1C6 |Fax: 519-747-0082

    Well, I think I have answered my own question, but I will leave it here in case anyone else has the same problem. 
    So, as far as I have been able to track down, it all went wrong when I was running through the connection wizard. 
    Under the section titled "Creating the Data Source" it describes how to find your database file and create the appropriate connection string. However, on my version of VS Express 2010, it offered me a prompt saying something like, "Would you like to
    move this database file into the application directory and change the connection string?" This sounded very sensible to me, so I said yes.
    All proceeded accordingly, and the database file now appeared in the Solution Explorer.
    The app.config file said that the connection string was
    Data Source=|DataDirectory|\Database1.sdf
    I presumed this would be interpreted correctly by the rest of the app, as it was generated by VS.
    But it wasn't, and what I cannot understand is how no error was generated, and data still seemed to pull
    into the bound controls.
    I have been testing it for a while now, and it seems that if I manually override the config file with the actual directory where the file exists, then there is not a problem. Data is retained in the file.
    This is more of a VB.net question, but I couldn't find it in the drop-down. I will try and move it there now.
    Thanks, guys, for your patience.
    P.S. RSingh, the code I posted above did come from the SaveItem_Click event
    handler.

  • Data Warehouse Partitioning question

    Hi All,
    I have a data warehousing partitioning question - I am defining partitions on a fact table in OWB and have range partitioning on a contract number field. Because I am on 10gR2 still, I have to put the contract number field into the fact table from its dimension in order to partition on it.
    The tables look like
    Contract_Dim (dimension_key, contract_no, ...)
    Contract_Fact(Contract_Dim, measure1,measure2, contract_no)
    So my question:
    When querying via reporting tools, my users are specifying contract_no conditions on the dimension object and joining into the contract_fact via the dimension_key->Contract_dim fields.
    I am assuming that the queries will not use partition pruning unless I put the contract_fact.contract_no into the query somehow. Is this true?
    If so, how can I 'hide' that additional step from my end-users? I want them to specify contract numbers on the dimension and have the query optimizer be smart enough to use partition pruning when running the query.
    I hope this makes sense.
    Thanks,
    Mike
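    To illustrate the concern (a sketch only, using the table shapes above; whether pruning actually occurs should always be confirmed against the execution plan): with range partitioning on contract_fact.contract_no, a query that filters only on the dimension generally cannot prune, while repeating the condition on the fact column gives the optimizer a pruning predicate.
    -- No predicate on the fact table's partition key: typically no pruning.
    SELECT f.measure1
    FROM   contract_fact f
    JOIN   contract_dim  d ON d.dimension_key = f.contract_dim
    WHERE  d.contract_no = 12345;
    -- Same condition repeated on the partitioned column: the optimizer can prune.
    SELECT f.measure1
    FROM   contract_fact f
    JOIN   contract_dim  d ON d.dimension_key = f.contract_dim
    WHERE  d.contract_no = 12345
    AND    f.contract_no = 12345;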

    I am about to start a partitioning program on my dimension / fact tables and was hoping to see some responses to this thread.
    I suggest that you partition the tables on the dimension key, not any attribute. You could partition both fact and dimension tables by the same rule. Hash partitions seem to make sense here, as opposed to range or list partitions.
    tck
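    A minimal sketch of that suggestion (illustrative DDL only; the partition count, storage clauses, and the rest of the columns are assumptions, and the resulting pruning/partition-wise join behavior still needs to be verified with an execution plan):
    -- Hash-partition both tables on the surrogate key they join on.
    CREATE TABLE contract_dim (
      dimension_key NUMBER PRIMARY KEY,
      contract_no   NUMBER
    )
    PARTITION BY HASH (dimension_key) PARTITIONS 16;
    CREATE TABLE contract_fact (
      contract_dim  NUMBER,
      measure1      NUMBER,
      measure2      NUMBER,
      contract_no   NUMBER
    )
    PARTITION BY HASH (contract_dim) PARTITIONS 16;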

  • Web-based data visualization tools

    Hi.  I originally posted this question to the LV forum, but I am going to modify my question because I think DIAdem may be a good solution for us.
    I am embarking on an interesting research project that will involve recording lots of physiological signals from patient monitors in a hospital.  I will be recording signals like BP, SpO2, pulse rate, ECG, etc.
    Right now, my main design concern is how to give clinicians access to the data after it is recorded.  I think DIAdem might be a good solution, but I really want it to be web-based.
    Here is my wish list:
    - Store the data securely, where access can be restricted and controlled (e.g. user must login)
    - Allow users (who are granted access) to login and view historical data via a simple web browser
    - Allow users to pick datasets and zoom, scroll, etc.
    - Allow user to view data in stacked plots (e.g. so they can see multiple signals simultaneously on the screen and scroll the time axis)
    - Allow user to annotate data (e.g. add comments at specific timestamps)
    Do you have any recommendations for a web-based data visualization tool like this?  Do you think DIAdem would meet my needs?

    Hello J Osborne,
    Your application sounds like a great fit for DIAdem and the DataFinder, with the exception of the web-based nature of the application. We currently do not offer a native web interface to DIAdem and its features, but there are ways to accomplish that using established technologies such as remote clients ...
    Here are my comments to your individual questions:
    - Store the data securely, where access can be restricted and controlled (e.g. user must login)
    * The DataFinder Server (which works with DIAdem) technology can be used to give specific users access to only data they are supposed to see, so we can cover this part of your application
    - Allow users (who are granted access) to login and view historical data via a simple web browser
    * This would have to be achieved via a remote client solution, since DIAdem offers no native Web client
    - Allow users to pick datasets and zoom, scroll, etc.
    * This is something that the DIAdem NAVIGATOR and DIAdem VIEW are very good at, including the ability to search for data sets based on specific key words, patient names, etc. and the ability to look at multiple data sets simultaneously in multiple windows with zooming and scrolling.
    - Allow user to view data in stacked plots (e.g. so they can see multiple signals simultaneously on the screen and scroll the time axis)
    * That's DIAdem VIEW, sounds like a perfect fit: http://zone.ni.com/devzone/cda/tut/p/id/7384
    - Allow user to annotate data (e.g. add comments at specific timestamps)
    * DIAdem VIEW has the ability to highlight events with cursors, and we could easily add the ability to add comments through a custom dialog and a short script. We could prepare and demo this feature for you ...
    I would be happy to discuss your application with you in more detail. Feel free to contact me at (800) 531-5066 (ask for Otmar Foehner) or via email at    otmar DOT foehner AT ni DOT com.
    Otmar D. Foehner
    Business Development Manager
    DIAdem and Test Data Management
    National Instruments
    Austin, TX - USA
    "For an optimist the glass is half full, for a pessimist it's half empty, and for an engineer is twice bigger than necessary."

  • Regarding Data Maintenance Tool

    Hi ABAPers,
    Can anyone explain to me what the use of the Data Maintenance Tool is? And also let me know where and how to create this tool.
    Thanks in Advance.
    Regards,
    Ramana Prasad. T

    Hi Ramana,
    Mass Maintenance to SAP Data – Current Situation
    If only a few tens of records are to be added or updated, entering this data manually into SAP is a reasonable option. However, if several hundred to several thousand records are to be updated, manual data entry is not an ideal choice. Manual data entry can get very expensive and time consuming and is not the best utilization of corporate resources – this is especially wasteful for data that already exists in another digital format, e.g., Excel files. Not only that, manual data entry is very error prone and can lead to severely degraded data quality.
    The SAP mass-change transactions are only useful for fairly simple data-change scenarios where one or more fields need to be changed to a constant value for all records. Most mass-change scenarios are more complex than that, and the second and third options are the ones mostly chosen.
    The other tools provided by SAP for automating mass maintenance of data include ABAP and LSMW, which are extremely technical in nature and require technical experts in the IT departments to create special-purpose data upload programs. Following the standard change control procedures and industry-standard software development practices, writing these special programs ends up being a very time-consuming and very expensive proposition for most companies. Further, many of these programs are throw-away programs written for one-time use only.
    Mass Maintenance to SAP Data – Ideal Scenario
    An ideal tool for mass maintenance of SAP data would let data-update projects be implemented by the end-user departments themselves. A few super-users within the end-user departments that supply the data should have the ability to transfer the data themselves, without relying on technical experts. Such a tool would significantly cut the time and effort required in SAP data maintenance.
    The ideal tool for mass maintenance of SAP data should be easy to learn and should require no programming. Furthermore, such a tool should work across all the different SAP modules and SAP products, including the different versions of SAP.
    WE PROPOSE TXSHUTTLE AND TABLEPRO AS THE IDEAL TOOLS FOR ROUTINE SAP DATA MAINTENANCE!
    Key Features:
      •      TxShuttle runs almost any SAP transaction, simple or complex, and even custom transactions, in any SAP product or module – from outside SAP. While running these transactions, TxShuttle allows easy upload of data between Microsoft Excel or Access and any SAP transaction.
      •      TablePro allows the easy download of any table or view from SAP into Excel files and even allows the joining of multiple tables.
      •      TxShuttle and TablePro are easy-to-use tools intended for SAP end users and do not require any programming.
      •      TxShuttle and TablePro preserve complete SAP role-based security and maintain a complete SAP audit trail.
      •      Using TxShuttle and TablePro needs no change to standard SAP – they are desktop software tools that run out of the box.
    Key Benefits:
      •      Save vast amounts of time and resources in mass maintenance of SAP data.
      •      Make the SAP business users and functional analysts more self-sufficient by reducing their dependence on valuable IT/IS resources.
      •      Save IT resources by significantly reducing custom programming efforts.
      •      Use the same tool for many applications. Maximize your return on investment.
    Reward if useful.
    Thanks
    Aneesh.

  • ODI-1241: Oracle Data Integrator tool execution fails.

    Hi
    I'm getting the following error while running the OdiOSCommand tool. I'm running the dos2unix command to convert text files from DOS to Unix format.
    The application tier is on a different host from the ODI setup. Please help resolve this issue.
    Error : ODI-1226: Step OdiOSCommand fails after 1 attempt(s).
    ODI-1241: Oracle Data Integrator tool execution fails.
    Caused By: com.sunopsis.dwg.function.SnpsFunctionBaseException: ODI-30038: OS command returned 1.

    The issue was with the value set for the OUTPUT_DIR variable. It was pointing to the wrong location.
    After setting it correctly the package completed successfully.
    Thanks for all your replies.
    To answer your question: we are finding junk data and need to run the command to remove it from the input files, which come from a different source.
    Edited by: user761125 on Jun 3, 2012 11:38 PM

  • Using the "Data Merge Tool"

    Hi everyone,
    I have 2 questions about the Data Merge tool. I understand how to use a CSV file to dynamically replace any text I desire.
    1. Can I use this same tool to replace pictures? Say, if I create baseball cards for an entire team.
    2. If I use it for creating business cards, but I only need to create ONE card at a time... I like how the "Records per document" option optimizes the page layout to fit as many as possible. OK, to the question: how can I print one record at a time, multiple times (say 20 per page), and include art for the other side of the page (a double-sided print that does not contain variable data, just a background)? I just need to make sure the tool knows how to line the art up, so that when it is cut, the front and back art is aligned properly.
    Thank you for your time.
    Rafa.

    Hi Lori
    Many thanks the binding was set to 'none'.
    Regards
    Dave

  • Reccomendations for web-based data visualization tools?

    Hello again,
    I am embarking on an interesting research project that will involve recording lots of physiological signals from patient monitors in a hospital.  I will be recording signals like BP, SpO2, pulse rate, ECG, etc.
    Right now, my main design concern is how to give clinicians access to the data after it is recorded.
    My wish list is:
    - Store the data securely, where access can be controlled (e.g. perhaps in a RDBMS like SQL Server)
    - Allow users (who are granted access) to login and view historical data via a simple web browser
    - Allow users to zoom, scroll, etc. 
    - Allow user to view data in stacked plots (e.g. so they can see ECG, BP and HR simultaneously on the screen and scroll the time axis)
    - Allow user to annotate data (e.g. add comments at specific timestamps)
    Anybody have any recommendations for a web-based data visualization tool like this?
    I know that people are probably going to recommend DIAdem ... but I don't think it's web-based, is it?

    There is actually a new offering called LabVIEW Web UI Builder.  It allows you to develop web-based applications with a custom GUI that can interface with other LabVIEW applications.  You can find more information on the Web UI Builder page on ni.com. There is also a dedicated forum that you can use to post questions about the Web UI Builder.  I hope that you find this helpful!
    Brandon Treece
    Applications Engineer
    National Instruments

  • What are the options for me for Data Mining tools?

    I am new to data mining. I have a project for which we are asked to identify the relationships between the variables we have, and we were asked to come up with a data mining algorithm. What options are available to me as data mining tools, and where do I start? How do I work through this process? Please advise.
    mayooran99

    I think that Predixion Insight is the easiest tool to use and is very powerful.
    All you need is a Windows OS, Office 2010 or 2013, and an internet connection. If you want to look at what you need to do before installing Predixion, I recommend you take a look at the walkthroughs at
    http://predixionsoftware.com/Help/webframe.html#Data%20Modeling%20Walkthroughs.html
    To install Predixion, go to
    http://predixionsoftware.com/Technology/Predixion-Insight-Download
    Tatyana Yakushev [PredixionSoftware.com]

  • Data Services as a data migration tool

    Hello All,
    Has anybody used Data Services as a data migration tool from an SAP ECC system?  If so, what is it like to use, and are there any documents on this?
    Cheers,
    Nick.

    Hi Nick.
    About documents, here are some of them, but one that I think could be useful for your understanding is the "BI109 SAP Data Migration" session: http://www.sdn.sap.com/irj/scn/shop?rid=/media/uuid/c08b931e-2a83-2c10-2aba-cb3968c5bc4e.
    It shows the data migration framework based on SAP BusinessObjects technologies for legacy to SAP and SAP to SAP migrations.
    More technical documents, showing how to set up iDocs, LSMW, etc:
    http://help.sap.com/bp_dmg603v1/DMS_US/Documentation/DM_installation_guide_EN_US.doc
    http://help.sap.com/bp_dmg603/DMS_US/Documentation/DM_Quick_Guide_EN_US.doc
    Regards,
    July

  • Where to download EH&S Data Editor tool

    Hi,
    I know the EH&S Data Editor tool is no longer available to download from the TechniData FTP site. Do you know where I can download the EH&S Data Editor tool on the SAP Service Marketplace or the TechniData site?
    Thanks,
    KC

    Mark,
    Somebody on my team would like to use this tool within the scope of a PLM project (for initial data load).
    Of course I will contact my SAP service account manager, but it would be useful to explain to the community (no need for all the details) why this product is not sold anymore (after the TechniData acquisition).
    Does it mean that there is a similar, equivalent solution within the SAP product portfolio (the Business Objects area, for example)?
    Thanks beforehand.

  • Issue in Data Admin Tool Kit

    Hi,
    I am modifying the Signature email body under Workflow in Custom Translations in the Data Admin Tool Kit. When I try to write text on a new line, I am not able to.
    I used escape sequences like \n, but it still does not write onto a new line. It comes out as below.
    Action needed at the following status\n<b>INPUTand REVIEW</b>\n Data Owner can enter content.Refer to the Ownership Grid on the Helix SharePoint for field ownership and instructions.\nData Reviewers can review and sign off to approve the spec.\n\n<b>PLANT TRIAL</b>\nData Owner can Modify, Update, Review owned content.\nData Reviewers can review and sign off to approve.
    Please help and let me know how to write the text onto a new line in Override Translations under Custom Translations.

    What is the SQL you are using? Is this SQL Server or Oracle?
    You can try a double backslash, like this:
    Action needed at the following status\\n<b>INPUTand REVIEW</b>\\n Data Owner can enter content.Refer to the Ownership Grid on the Helix SharePoint for field ownership and instructions.\\nData Reviewers can review and sign off to approve the spec.\\n\\n<b>PLANT TRIAL</b>\\nData Owner can Modify, Update, Review owned content.\\nData Reviewers can review and sign off to approve.
    If it is SQL Server, I've used the following in the past with SQL Server Management Studio, just including the newline/carriage return directly in the editor, and it worked:
    INSERT INTO TysonTranslationItemValues (pkid, fkTranslationItem, langValue, langID) SELECT '2167d3fa954c-2ab3-4e7e-bb06-5bda86eba661', '2166b8a82f69-b243-4d2b-a380-ffdd542e01d6', N'Attention: @PERSONTOSIGNOFF
    Your ''@FUNCAREA'' signature has been requested for the activity ''@TITLE.'' (on project ''@PROJECT''). Please review the document and use the ''Workflow'' signpost icon in the upper toolbar to submit your approval (or return with comments) once you have reviewed the supporting documents:
    @URL', 0 FROM DUAL;
    Hope that helps.
    --Trey
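    If typing a literal line break into the editor is not convenient, another option is to concatenate explicit newline characters in the SQL itself. This is only a sketch: it reuses the key value from the INSERT above purely for illustration, and the table, column, and key would need to be adjusted to your environment:
    -- Oracle: CHR(10) is a line feed. On SQL Server, use CHAR(13) + CHAR(10) and '+' instead of '||'.
    UPDATE TysonTranslationItemValues
    SET    langValue = 'Action needed at the following status' || CHR(10) ||
                       '<b>INPUT and REVIEW</b>' || CHR(10) ||
                       'Data Owner can enter content.'
    WHERE  pkid = '2167d3fa954c-2ab3-4e7e-bb06-5bda86eba661';
    Whether the consuming application then renders CHR(10) as a line break (or needs <br> instead) depends on how the email body is built, so that part still has to be verified in the Data Admin Tool Kit itself.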

  • Data transformation tool in wli8.1

    hi,
    We are trying to explore the data transformation tool of WLI 8.1.
    We have COBOL copybooks as input and we need to end up populating Oracle tables.
    These are the steps we are following:
    1. Use Format Builder to change the COBOL copybooks into .mfl files.
    2. Import them into the existing schema.
    3. Create a schema (.xsd) which would take the output, which could then be transferred
    into the Java variables while creating the .jcx file.
    4. Create a .dtf file that would take the .mfl as the input; the output
    would be the .xsd file created in step 3.
    5. Create a .jpd file that would be initiated with the client request and would
    use the .dtf as an input in the control send node and will return the .xsd file
    created in step 3.
    The xsd file we have created is:
    <?xml version="1.0"?>
    <xs:schema
         xmlns:xs="http://www.w3.org/2001/XMLSchema"
         xmlns:tns="http://www.bea.com/TransformationWeb/part1xform.xsd"
         targetNamespace="http://www.bea.com/TransformationWeb/part1xform.xsd">
    <xs:element name="part1xform"
    type="tns:part1xform"/>
    <xs:complexType name="part1xform">
    <xs:sequence>
    <xs:element name="part1xform-rct_nbr" type="xs:integer"/>
    <xs:element name="part1xform-fac_nbr" type="xs:integer"/>
    <xs:element name="part1xform-rct_suff" type="xs:string"/>
    </xs:sequence>
    </xs:complexType>
         </xs:schema>
    We have not as yet gone to the stage where we can create a .jcx file using the
    output of the first stage. The error we get while trying to run the .jpd file
    in the last step is:
    An unexpected exception occurred while attempting to locate the run-time information
    for this Web Service. Error: java.lang.NoClassDefFoundError:com/bea/transformationWeb/part1Xform/Part1XformDocument
    Any help you could provide on this would be really appreciated.
    However, your suggestions on some other method to follow for data transformation
    from COBOL copybooks to Oracle tables are also welcome.
    Divya Ravishankar
    Advance Computer Services Ltd.
    Millennium Business Park, Mhape,
    Navi Mumbai-400703. Phone: 27782805/6/7.

    Can you please attach that COBOL copybook so that I can test it?
    "divya" <[email protected]> wrote:
    >
    hi,
    We are trying to explore the data transformation tool of the wil8.1.
    We have cobol copy books as input and we need to end up populating oracle
    tables.
    These are the steps we are following:
    1. Use format builder to change cobol copy books into .mfl files
    2. Import them into the existing schema.
    3.. Create a schema(.xsd) which would take the output which then could
    be transferd
    into the java variables while creating the .jcx file.
    3. Create a .dtf file that would take in the .mfl as the input and the
    output
    would be the .xsd file created in step 3.
    4. Create a .jpd file that would be initiated with the client request
    and would
    use the .dtf as an input in the control send node and will return the
    .xsd file
    creted in the step3.
    The xsd file we have created is:
    <?xml version="1.0"?>
    <xs:schema
         xmlns:xs="http://www.w3.org/2001/XMLSchema"
         xmlns:tns="http://www.bea.com/TransformationWeb/part1xform.xsd"
         targetNamespace="http://www.bea.com/TransformationWeb/part1xform.xsd">
    <xs:element name="part1xform"
    type="tns:part1xform"/>
    <xs:complexType name="part1xform">
    <xs:sequence>
    <xs:element name="part1xform-rct_nbr" type="xs:integer"/>
    <xs:element name="part1xform-fac_nbr" type="xs:integer"/>
    <xs:element name="part1xform-rct_suff" type="xs:string"/>
    </xs:sequence>
    </xs:complexType>
         </xs:schema>
    We have not as yet gone to the stage where we can create a .jcx file
    using the
    output of the first stage. The error we get while trying to run the .jpd
    file
    in the last step is:
    An unexpected exception occurred while attempting to locate the run-time
    information
    for this Web Service. Error: java.lang.NoClassDefFoundError:com/bea/transformationWeb/part1Xform/Part1XformDocument
    Any help you coulde provide in this would be really appriciated.
    However, your suggestions on some other method to be followed for data
    tranformation
    from cobol copybooks to oracle tables are also welcome.
    Divya Ravishankar
    Advance Computer Services Ltd.
    Millennium Business Park, Mhape,
    Navi Mumbai-400703. Phone: 27782805/6/7.

  • SAP Master data migration tools

    Hi,
    I would like to know which SAP standard tools are available for all master data migration; kindly share the same, as it is required for us now.
    We have to migrate the data from legacy systems to SAP, and we have to use only SAP standard master data migration tools.
    Kindly share the same.
    Thanks and Regards,
    Raveendra

    Raveendra,
    SAP migrates data from legacy systems using standard tools like LSMW, BDC, and BAPI. Under LSMW you will have Batch Input, Batch Recording, BAPI, and IDoc options. Depending upon the requirement you can choose any one of them. BAPI is advisable instead of the BDC method.
    Also, for the utilities industry, SAP provides the IS-U migration tool (EMIGALL).
