How to find out what notes need to be applied?

Hello All,
      In my systems (R/3 and CRM) there are lots of ABAP dumps (viewed with ST22). How do I find out which notes to apply from those dumps? I am new to reading ABAP dumps. Please help me find out which notes need to be applied.
Is there any procedure?
Regards
N.S

To search for a note, you need to find the keywords shown in the short dump under the "How to correct the error" section.
Then search for those keywords on service.sap.com to list the available notes.
One important thing is to know your system version and which support package level you are on; then you can identify the relevant note.
The note will also give details of which systems it applies to.
You can use transaction SNOTE to apply notes; they can also be applied manually, but that is not recommended.
Regds
Manohar

Similar Messages

  • After getting it all working, how do I find out if all the changes were applied to the target?

    I finally got extract/datapump/replicat working.
    I have a question though: how do I know if the target is in sync with the source DB?
    Do I have to use GoldenGate Veridata? If so, how do I install and configure Veridata?
    Thanks in advance.

    Data validation is nontrivial in logical replication products, including GoldenGate. With a physical standby database, the data blocks are physically identical to the source database (and thus simple to validate). But because GoldenGate runs SQL statements against the target, the physical data will be different. So data validation involves checking all the rows of any replicated table to make sure that the data is indeed the same.
    Veridata is one way of doing this, but there are others, starting with simple SQL scripts that do count(distinct)s, or that calculate hash values and compare them.
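    For instance, a minimal sketch (the table and column names are just placeholders): run the same statements on the source and on the target and compare the output.
    select count(*), count(distinct order_id) from orders;
    select sum(ora_hash(order_id || '|' || status || '|' || amount)) from orders;
    If the row counts and the hash totals match on both sides, the table contents are almost certainly identical; a mismatch tells you which table to dig into row by row.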
    A Google search for [oracle data comparison tool|https://www.google.com/search?q=data+comparison+too] shows a number of tools that do much the same thing.
    Marc

  • How to tell what template is applied to a FrameMaker file?

    I'm new to FrameMaker... so please excuse me if this is a silly question.
    We are trying to figure out how to tell what template has been applied to a FrameMaker file. We are using the structured view in FrameMaker 7.2.
    Perhaps I am thinking too much like a Microsoft Word user -- in Word I can look at what template has been applied to a document.
    Does FrameMaker work the same way -- or are templates associated with FrameMaker files in some other way?
    Thank you --
    J-Ha

    J-Ha,
    FrameMaker works differently. Strictly speaking, there is no way to tell which template was used last. A FrameMaker template is just a FrameMaker document which you call "template". Using File > Import > Formats you can import all or selected formats from one document to another. The main benefit: Each document is completely self-contained and does not require any other document.
    Many solutions I know of include a user variable named something like "template-version" in their template document, or put the version information on one of the reference pages. When importing user variables or reference pages, this can tell you the version imported last.
    - Michael

  • HT4847 How can I change my iCloud email address without losing all my data and what not???? Please help...

    How can I change my iCloud email address without losing all my data and what not???? Please help...

    Welcome to the Apple Community.
    You can't change it but you can create an alias to use.

  • Hi, how do I boot the Mac without running Windows (Boot Camp)? The problem is this: the last time I ran Windows it hung and died (it cannot get past the logo). Now I can boot neither Windows nor the Mac! What should I do?

    The problem is this: the last time I ran Windows it hung and died (it cannot get past the logo). Now I can boot neither Windows nor the Mac! What should I do?

    Can anybody help me, please!

  • How can I take notes, index them, and retrieve the notes on what I read? I read and can't remember the source.

    How can I take notes on what I read, then index/catalog them for search and retrieval on my MacBook Air or on my iPad?

    Try this one http://gigaom.com/apple/reset-os-x-password-without-an-os-x-cd/

  • Video filters not working when applied to sequence clips

    I have looked through the forums and haven't found the solution to my problem. I am having problems with video filters on clips in the timeline. I have footage that needs color correction. In the viewer I created a 3-way filter for one clip, saved it, and applied it to all clips in the browser. They all open corrected in the viewer. (I did this because I was reviewing the clips with my client.)
    When I put the clip in the timeline it does not indicate it needs a render, and the filter does not appear on the clip, but when I double-click and open the sequence clip in the viewer the color correction tab is there. I've removed attributes and tried again several times, both using the menu with the clip selected, and by double-clicking the clip into the viewer and applying the filter there. The tab is always there but the filter does not appear on the clip. It only appears when the process is done to clips from the browser in the viewer. I've checked that RT is set to Unlimited, I've done a render-all video, and I've checked the settings under the render tab: filters is checked. I've gone into another project and have the same problem with different clips. I had a recent project where I used a blur effect and it worked fine. My video is DV and my sequence settings are DV NTSC.
    I've run out of my own ideas and of suggestions from other threads. What is my next troubleshooting step? Should I delete preferences? I haven't done that with FCP; can I do it myself, or should I use Rescue? Could that create other problems for me? Any other ideas?
    I work in Afghanistan and have no resources here for any Mac support besides books and forums.
    Thanks

    Hey Catrina
    Not sure what the issue is with your application of the 3-Way Color Corrector in the Browser. My guess is that perhaps you only thought you applied the "saved" instance of the 3-way corrector but instead applied a default, neutral instance of the corrector. How did you save the correction you applied to the other instances? A good method is, having applied and tweaked it as needed on a sample clip instance, to drag a copy of it out of the clip's Filters tab and drop it directly into your Browser window. To check, double-click that Browser instance of the corrector to open it in the Viewer window and see if it has the correction settings you expect, and not the default ones. If so, go ahead and drag it to another clip instance in the Browser and then drag that clip to your sequence... has the correction "stuck" now? But while we figure it out, for now, why not just apply the saved correction directly to the clips in the sequence? Regarding trashing your preferences: download Preference Manager from Digital Rebellion. Very straightforward to use.
    Cheers
    Andy

  • What are the steps for applying incremental backups to a standby database in 11g?

    Hi All,
    I have built an 11g non-ASM standby database from an ASM RAC database. Now I want to apply an incremental backup to the standby database from the primary, but I am not sure how to do it. I tried the following and got the error "ORA-01103: database name 'ins-prim' in control file is not 'ins-sec'":
    1. I configured the standby database with an RMAN backup.
    2. After finishing the installation, I took an incremental backup on the primary server (ins-prim) and moved the incremental backup and the control file to the standby (ins-sec) database.
    3. I started the standby database in nomount mode.
    4. restore controlfile from 'incremental backup location on the standby database'
    5. alter database mount; and got this error:
    "ORA-01103: database name 'ins-prim' in control file is not 'ins-sec'"
    What are the steps for applying incremental backups to a standby database with 11g?
    Thank you

    "I built the database from a backup, changed from ASM to non-ASM, and changed the location of the data files and log files. I think these changes make the standby database a logical one."
    You can have a physical standby with different locations for everything (redo/controlfiles/datafiles), ASM and non-ASM, etc. I have such a configuration in production (10gR2).
    "I built the database from backup"
    Are you sure you have a standby? Does ins-sec receive the archivelog files from the primary? How did you proceed to build this database? I suspect you don't have a standby at all! A physical standby keeps the same DB_NAME as its primary (only DB_UNIQUE_NAME differs), which is why ORA-01103 suggests ins-sec was created as a separate database. If you have duplicated the database, ins-sec and ins-prim are independent databases and you won't be able to apply an incremental backup (your script was not correct, but that is another story).
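    For reference, if ins-sec really were a physical standby, the usual RMAN roll-forward looks roughly like this (the SCN and file names below are placeholders, not your actual values):
    # On the primary: back up all changes since the standby's current SCN,
    # plus a controlfile created FOR STANDBY
    BACKUP INCREMENTAL FROM SCN 1234567 DATABASE FORMAT '/backup/fwd_%U';
    BACKUP CURRENT CONTROLFILE FOR STANDBY FORMAT '/backup/stby_ctl_%U';
    # On the standby, started in nomount: restore the STANDBY controlfile
    # (restoring the primary's own controlfile is what raises ORA-01103)
    RESTORE STANDBY CONTROLFILE FROM '/backup/stby_ctl_piece';
    ALTER DATABASE MOUNT;
    CATALOG START WITH '/backup/fwd_';
    RECOVER DATABASE NOREDO;
    But again, this only works against a true physical standby created with the same DB_NAME as the primary.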

  • How do I convert to mobile, e.g. apply touch events where click events exist (mainly for JavaScript or jQuery)?

    I built a tinyStickyNotes app and it works in PC browsers. How do I convert it to mobile, e.g. apply touch events where click events exist (mainly for JavaScript or jQuery)? Do I need to develop two versions of my app, mobile and PC?
    Do I really need this, or does PhoneGap do the job?
    Does a PhoneGap book cover what needs to change in this case, or should I look for another kind of book?
    Also, do I need to register as a PhoneGap user at 9.99/mo if I am an Adobe Creative Cloud member?
    And can the Cloud file drive (file explorer on Windows or Mac) be installed on computers that do not have the Creative Cloud apps on them?

    Absolutely. Since the mouse will not fire touch events and the touch screen will not fire mouse events, the two should not conflict with one another. In future CSS specifications there will be something called touch points, which will handle the events no matter what the input is.

  • What is BI? How do we implement it, and what does it cost to implement?

    What is BI? How do we implement it, and what does it cost to implement?
    Thanks,
    Sumit.

    Hi Sumit,
    Below is a description addressing your query.
    Business Intelligence is a process for increasing the competitive advantage of a business by intelligent use of available data in decision making.
    The five key stages of Business Intelligence:
    1. Data Sourcing
    2. Data Analysis
    3. Situation Awareness
    4. Risk Assessment
    5. Decision Support
    Data sourcing
    Business Intelligence is about extracting information from multiple sources of data. The data might be: text documents - e.g. memos or reports or email messages; photographs and images; sounds; formatted tables; web pages and URL lists. The key to data sourcing is to obtain the information in electronic form. So typical sources of data might include: scanners; digital cameras; database queries; web searches; computer file access; etcetera.
    Data analysis
    Business Intelligence is about synthesizing useful knowledge from collections of data. It is about estimating current trends, integrating and summarising disparate information, validating models of understanding, and predicting missing information or future trends. This process of data analysis is also called data mining or knowledge discovery. Typical analysis tools might use:
    • probability theory - e.g. classification, clustering and Bayesian networks;
    • statistical methods - e.g. regression;
    • operations research - e.g. queuing and scheduling;
    • artificial intelligence - e.g. neural networks and fuzzy logic.
    Situation awareness
    Business Intelligence is about filtering out irrelevant information, and setting the remaining information in the context of the business and its environment. The user needs the key items of information relevant to his or her needs, and summaries that are syntheses of all the relevant data (market forces, government policy etc.).  Situation awareness is the grasp of  the context in which to understand and make decisions.  Algorithms for situation assessment provide such syntheses automatically.
    Risk assessment
    Business Intelligence is about discovering what plausible actions might be taken, or decisions made, at different times. It is about helping you weigh up the current and future risk, cost or benefit of taking one action over another, or making one decision versus another. It is about inferring and summarising your best options or choices.
    Decision support
    Business Intelligence is about using information wisely. It aims to warn you of important events, such as takeovers, market changes, and poor staff performance, so that you can take preventative steps. It seeks to help you analyse and make better business decisions, to improve sales, customer satisfaction or staff morale. It presents the information you need, when you need it.
    This section describes how we are using extraction, transformation and loading (ETL) processes and a data warehouse architecture to build our enterprise-wide data warehouse in incremental project steps. Before an enterprise-wide data warehouse could be delivered, an integrated architecture and a companion implementation methodology needed to be adopted. A productive and flexible tool set was also required to support ETL processes and the data warehouse architecture in a production service environment. The resulting data warehouse architecture has the following four principal components:
    • Data Sources
    • Data Warehouses
    • Data Marts
    • Publication Services
    ETL processing occurs between data sources and the data warehouse, between the data warehouse and data marts and may also be used within the data warehouse and data marts.
    Data Sources
    The university has a multitude of data sources residing in different Data Base Management System (DBMS) tables and non-DBMS data sets. To ensure that all relevant data source candidates were identified, physical and logical inventories were conducted. The compilation of these inventories ensures that we have an enterprise-wide view of the university data resource.
    The physical inventory was comprised of a review of DBMS cataloged tables as well as data sets used by business processes. These data sets had been identified through developing the enterprise-wide information needs model.
    The logical inventory was constructed from "brain-storming" sessions which focused on common key business terms which must be referenced when articulating the institution's vision and mission (strategic direction, goals, strategies, objectives and activities). Once the primary terms were identified, they were organized into directories such as "Project", "Location", "Academic Entity", "University Person", "Budget Envelope" etc. Relationships were identified by recognizing "natural linkages" within and among directories, and the "drill-downs" and "roll-ups" that were required to support "report by" and "report on" information hierarchies. This exercise allowed the directories to be sub-divided into hierarchies of business terms which were useful for presentation and validation purposes.
    We called this important deliverable the "Conceptual Data Model" (CDM) and it was used as the consolidated conceptual (paper) view of all of the University's diverse data sources. The CDM was then subjected to a university-wide consultative process to solicit feedback and communicate to the university community that this model would be adopted by the Business Intelligence (BI) project as a governance model in managing the incremental development of its enterprise-wide data warehousing project.
    Data Warehouse
    This component of our data warehouse architecture (DWA) is used to supply quality data to the many different data marts in a flexible, consistent and cohesive manner. It is a 'landing zone' for inbound data sources and an organizational and re-structuring area for implementing data, information and statistical modeling. This is where business rules which measure and enforce data quality standards for data collection in the source systems are tested and evaluated against the data quality business rules/standards required to perform the data, information and statistical modeling described previously.
    Inbound data that does not meet the data warehouse's data quality business rules is not loaded into the data warehouse (for example, if a hierarchy is incomplete). Ideally, rejected records are corrected in the operational system; if this is not possible, then the start dates for when data can begin to be collected into the data warehouse may need to be adjusted to accommodate the necessary source-system data entry "re-work". Existing systems and procedures may need modification in order to permanently accommodate required data warehouse data quality measures. Severe situations may occur in which new data entry collection transactions or entire systems will need to be either built or acquired.
    We have found that a powerful and flexible extraction, transformation and loading (ETL) process is to use Structured Query Language (SQL) views on host database management systems (DBMS) in conjunction with a good ETL tool such as SAS® ETL Studio. This tool enables you to perform the following tasks:
    • The extraction of data from operational data stores
    • The transformation of this data
    • The loading of the extracted data into your data warehouse or data mart
    When the data source is a "non-DBMS" data set it may be advantageous to pre-convert it into a SAS® data set to standardize data warehouse metadata definitions. Then it may be captured by SAS® ETL Studio and included in the data warehouse along with any DBMS source tables using consistent metadata terms. SAS® data sets, non-SAS® data sets, and any DBMS table will provide the SAS® ETL tool with all of the necessary metadata required to facilitate productive extraction, transformation and loading (ETL) work.
    Having the ability to utilize standard structured query language (SQL) views on host DBMS systems and within SAS® is a great advantage for ETL processing. The views can serve as data quality filters without having to write any procedural code. The option exists to "materialize" these views on the host systems, or to leave them "un-materialized" on the hosts and "materialize" them on the target data structure defined in the SAS® ETL process. These choices may be applied differentially depending upon whether you are working with "current only" or "time series" data. Different deployment configurations may be chosen based upon performance issues or cost considerations. The flexibility of choosing different deployment options based upon these factors is a considerable advantage.
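    As an illustration, a data quality filter of this kind might be no more than a view (the table and column names here are invented for the example):
    CREATE VIEW clean_enrolments AS
    SELECT student_id, term_code, programme_code, enrolment_date
    FROM enrolments
    WHERE student_id IS NOT NULL                        -- reject rows missing the key
      AND term_code IN (SELECT term_code FROM terms);   -- reject rows whose hierarchy is incomplete
    Rows that fail the predicates simply never reach the warehouse, and no procedural code is involved.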
    Data Marts
    This component of the data warehouse architecture may manifest as the following:
    • Customer "visible" relational tables
    • OLAP cubes
    • Pre-determined parameterized and non-parameterized reports
    • Ad-hoc reports
    • Spreadsheet applications with pre-populated work sheets and pivot tables
    • Data visualization graphics
    • Dashboard/scorecards for performance indicator applications
    Typically a business intelligence (BI) project may be scoped to deliver an agreed-upon set of data marts. Once these have been well specified, the conceptual data model (CDM) is used to determine what parts need to be built, or is used as a reference to conform the inbound data from any new project. After the detailed data mart specifications (DDMS) have been verified and the conceptual data model (CDM) components determined, a source and target logical data model (LDM) can be designed to integrate the detailed data mart specification (DDMS) and the conceptual data model (CDM). An extraction, transformation and loading (ETL) process can then be set up and scheduled to populate the logical data models (LDM) from the required data sources and assist with any time series and data audit change control requirements.
    Over time, as more and more data marts and logical data models (LDMs) are built, the conceptual data model (CDM) becomes more complete. One very important advantage of this implementation methodology is that the order of the data marts and logical data models can be entirely driven by project priority, project budget allocation and time-to-completion constraints/requirements. This data warehouse architecture implementation methodology does not need to dictate project priorities or project scope as long as the conceptual data model (CDM) exercise has been successfully completed before the first project request is initiated.
    McMaster's Data Warehouse design
    [Diagram: DB2 and Oracle operational sources feed the development, test and production warehouses through a staging area via ETL; further ETL populates the data marts (SAS data sets), which users access through the BI tool.]
    Publication Services
    This is the visible presentation environment that business intelligence (BI) customers will use to interact with the published data mart deliverables. The SAS® Information Delivery Portal will be utilized as a web delivery channel to deliver a "one-stop information shopping" solution. This software solution provides an interface to access enterprise data, applications and information. It is built on top of the SAS Business Intelligence Architecture, provides a single point of entry and provides a Portal API for application development. All of our canned reports generated through SAS® Enterprise Guide, along with a web-based query and reporting tool (SAS® Web Report Studio), will be accessed through this publication channel.
    Using the portal's personalization features we have customized it for a McMaster "look and feel". Information is organized using pages and portlets, and our stakeholders will have access to public pages along with private portlets based on role authorization rules. Stakeholders will also be able to access SAS® data sets from within Microsoft Word and Microsoft Excel using the SAS® Add-In for Microsoft Office. This tool will enable our stakeholders to execute stored processes (a SAS® program which is hosted on a server) and embed the results in their documents and spreadsheets. Within Excel, the SAS® Add-In can:
    • Access and view SAS® data sources
    • Access and view any other data source that is available from a SAS® server
    • Analyze SAS® or Excel data using analytic tasks
    The SAS® Add-In for Microsoft Office will not be accessed through the SAS® Information Delivery Portal as this is a client component which will be installed on individual personal computers by members of our Client Services group. Future stages of the project will include interactive reports (drill-down through OLAP cubes) as well as balanced scorecards to measure performance indicators (through SAS® Strategic Performance Management software). This, along with event notification messages, will all be delivered through the SAS® Information Delivery Portal.
    Publication is also channeled according to audience with appropriate security and privacy rules.
    SECURITY – AUTHENTICATION AND AUTHORIZATION
    The business value derived from using the SAS® Value Chain Analytics includes an authoritative and secure environment for data management and reporting. A data warehouse may be categorized as a "collection of integrated databases designed to support managerial decision making and problem solving functions" and "contains both highly detailed and summarized historical data relating to various categories, subjects, or areas". Implementation of the research funding data mart at McMaster has meant that our stakeholders now have electronic access to data which previously was not widely disseminated. Stakeholders are now able to gain timely access to this data in the form that best matches their current information needs. Security requirements are being addressed taking into consideration the following:
    • Data identification
    • Data classification
    • Value of the data
    • Identifying any data security vulnerabilities
    • Identifying data protection measures and associated costs
    • Selection of cost-effective security measures
    • Evaluation of effectiveness of security measures
    At McMaster access to data involves both authentication and authorization. Authentication may be defined as the process of verifying the identity of a person or process within the guidelines of a specific security policy (who you are). Authorization is the process of determining which permissions the user has for which resources (what you may do). Authentication is also a prerequisite for authorization. At McMaster, business intelligence (BI) services that are not public require a sign-on with a single university-wide login identifier, which is currently authenticated using Microsoft Active Directory. After a successful authentication the university login identifier can be used by the SAS® Metadata Server. No passwords are ever stored in SAS®. Future plans at the university call for this authentication to be done using Kerberos.
    At McMaster, aggregate information will be open to all. Granular security is being implemented as required through a combination of SAS® Information Maps and stored processes. SAS® Information Maps consist of metadata that describe a data warehouse in business terms. Using SAS® Information Map Studio, an application used to create, edit and manage SAS® Information Maps, we will determine what data our stakeholders will be accessing through either SAS® Web Report Studio (ability to create reports) or the SAS® Information Delivery Portal (ability to view only). Previously, access to data residing in DB2 tables was granted by creating views using structured query language (SQL). Information maps are much more powerful, as they capture metadata about allowable usage and query generation rules; they describe what can be done, are database independent, can cross databases, and hide the physical structure of the data from the business user. Since query code is generated in the background, the business user does not need to know structured query language (SQL). As well as using Information Maps, we will also be using SAS® stored processes to implement role-based granular security.
    At the university some business intelligence (BI) services are targeted for particular roles such as researchers. The primary investigator role of a research project needs access to current and past research funding data at both the summary and detail levels for their research project. A SAS® stored process (a SAS® program which is hosted on a server) is used to determine the employee number of the login by checking a common university directory and then filtering the research data mart to selectively provide only the data that is relevant for the researcher who has signed onto the decision support portal.
    Other business intelligence (BI) services are targeted for particular roles such as Vice-Presidents, Deans, Chairs, Directors, Managers and their Staff. SAS® stored processes are used as described above with the exception that they filter data on the basis of positions and organizational affiliations. When individuals change jobs or new appointments occur the authorized business intelligence (BI) data will always be correctly presented.
    As the SAS® stored process can be executed from many environments (for example, SAS® Web Report Studio, SAS® Add-In for Microsoft Office, SAS® Enterprise Guide) authorization rules are consistently applied across all environments on a timely basis. There is also potential in the future to automatically customize web portals and event notifications based upon the particular role of the person who has signed onto the SAS® Information Delivery Portal.
    ARCHITECTURE (PRODUCTION ENVIRONMENT)
    We are currently in the planning stages for building a scalable, sustainable infrastructure which will support a scaled deployment of the SAS® Value Chain Analytics. We are considering implementing the following three-tier platform, which will allow us to scale horizontally in the future.
    Our development environment consists of a server with 2 x Intel Xeon 2.8 GHz processors and 2 GB of RAM, running Windows 2000 – Service Pack 4.
    We are considering the following for the scaled roll-out of our production environment.
    A. Hardware
    1. Server 1 - SAS® Data Server
    - 4-way 64-bit 1.5 GHz Itanium 2 server
    - 16 GB RAM
    - 2 x 73 GB drives (RAID 1) for the OS
    - 1 10/100/1Gb Cu Ethernet card
    - 1 Windows 2003 Enterprise Edition for Itanium
    2. Mid-Tier (Web) Server
    - 2-way 32-bit 3 GHz Xeon server
    - 4 GB RAM
    - 1 10/100/1Gb Cu Ethernet card
    - 1 Windows 2003 Enterprise Edition for x86
    3. SAN Drive Array (modular and can grow with the warehouse)
    - 6 x 72 GB drives (RAID 5), 360 GB total, for SAS® and data
    B. Software
    1. Server 1 - SAS® Data Server
    - SAS® 9.1.3
    - SAS® Metadata Server
    - SAS® WorkSpace Server
    - SAS® Stored Process Server
    - Platform JobScheduler
    2. Mid-Tier Server
    - SAS® Web Report Studio
    - SAS® Information Delivery Portal
    - BEA WebLogic for the future SAS® SPM platform
    - Xythos Web File System (WFS)
    3. Client-Tier Server
    - SAS® Enterprise Guide
    - SAS® Add-In for Microsoft Office
    REPORTING
    We have created a number of parameterized stored processes using SAS® Enterprise Guide, which our stakeholders will access as both static (HTML as well as PDF documents) and interactive reports (drill-down) through SAS® Web Report Studio and the SAS® Add-In for Microsoft Office. All canned reports along with SAS® Web Report Studio will be accessed through the SAS® Information Delivery Portal.
    NEXT STEPS
    Next steps of the project include development of a financial data mart along with appropriate data quality standards, monthly frozen snapshots and implementation of university-wide financial reporting standards. This will facilitate electronic access to the integrated financial information necessary for the development and maintenance of an integrated, multi-year financial planning framework. Canned reports will include monthly web-based financial statements with drill-down capability, along with budget templates automatically populated with data values and saved in different workbooks for different subgroups (for example, by department). The latter will be accomplished using Microsoft Dynamic Data Exchange (DDE).
    As well, we will begin the implementation of SAS® Strategic Performance Management software to support the performance measurement and monitoring initiative that is a fundamental component of McMaster's strategic plan. This tool will assist in critically assessing and identifying meaningful and statistically relevant measures and indicators. This software can perform causal analyses among various measures within and across areas, providing useful information on inter-relationships between factors and measures. As well as demonstrating how decisions in one area affect other areas, these cause-and-effect analyses can reveal both good performance drivers and possible detractors, and enable 'evidence-based' decision-making. Finally, the tool provides a balanced scorecard reporting format, designed to identify statistically significant trends and results, that can be tailored to the specific goals, objectives and measures of the various operational areas of the University.
    LESSONS LEARNED
    Lessons learned include the importance of taking a consultative approach not only in assessing information needs, but also in building data hierarchies, understanding subject matter, and in prioritizing tasks to best support decision making and inform senior management. We found that a combination of training and mentoring (knowledge transfer) helped us accelerate learning the new tools. It was very important to ensure that time and resources were committed to complete the necessary planning and data quality initiatives prior to initiating the first project. When developing a project plan, it is important to

  • I am trying to find out what carrier my PlayBook 4G LTE is locked to.

    I am trying to find out what carrier my PlayBook 4G LTE is locked to.
    I bought the PlayBook at a charity auction held at work for a Toronto youth fund. The box says Telus, and all the info on the box matches the serial number and IMEI. I have tried using my Rogers SIM, went to Bell and checked, and also chatted with Telus online and by going to the store. The PlayBook does not accept any of these carriers' SIMs. I have tried buying unlock codes and so far 8 codes did not work; I have only two tries left. I have reached out to the organizers of the auction and have been waiting with no word from them.
    What do I do? The PlayBook works on wifi. The only reason I bought this PlayBook was to have a tablet for one of my data lines that I can use when out and about. I tried calling BlackBerry support, and the way the person I spoke to the last time I called addressed me was very shameful, even though I have registered for the complimentary 90-day support. I have 3 other PlayBooks that I use in my home, but all are wifi-only models.
    Details of the PlayBook are below. I will really appreciate anyone's help.
    The IMEI is <removed> and the serial <removed>. The PRD from the box is PRD-<Removed>.
    The PlayBook is on OS 2.1.0.1526.
    MOD EDIT: Removed personal information to comply with Community Guidelines and Terms and Conditions of Use.

    When I restart my iPad the connect-to-iTunes screen comes up and stays on until it shuts down again. I have read a report from Apple support suggesting I reinstall iTunes, so I might try that again, and also your suggestion, which I shall also try. But thanks again - rg1547

  • I had to erase my data to upgrade my iPhone software from an older version. I backed up first, but seem to have lost all my notes. Does anyone know how to retrieve my old notes?

    I had to erase my data to upgrade my iPhone software from an older version. I backed up first, but seem to have lost all my notes. Does anyone know how to retrieve my old notes?

    Is this the note you are referring to:
    Not sure if this can help you.
    I checked my notes. They are stored here:
    MacintoshHD/Users/MYHOME/Library/Containers/com.apple.notes/Data/Library/CoreData/ExternalRecords/xxxxxserie.of.numbers.probably.yours.are.different/IMAPNote/_records/0/
    If I go to this window in Finder and enter Time Machine I can go back and find old notes...
    If so, am I doing this on my new hard drive or on the Time Machine? When I do this on the Time Machine, what is "MYHOME"? Also, I am unable to find "Containers".
    If I go to users/jeff/library, then I see components, compositions, contextual, etc., but not containers.
    Thanks

  • How to find what transaction a user was running for a given period

    hi
    Could anybody tell me how to find what transaction a particular user was running for a given period in the past?

    Hi,
    You need to find the list of Tcodes executed by a SAP user:
    1) Tcode: ST03N
    2) Select the Expert Mode option (there you will find the server host name and "Today").
       If you select the host name you will find Day/Week/Month options (the same appears under "Today").
    3) Now select the Day, Week, etc. option.
    Now you can look in two ways:
    1) Transaction profile (Standard / EarlyWatch)
    2) User (user profile)
    If you select the Transaction profile, under it you will find the Aggregation option: application/package/transaction etc.
    I hope this helps you find it.
    Regards
    ASR

  • How to use the not exists query

    Hello,
    Using OBIEE 11g,
    I would like to know how to deal with a NOT EXISTS query in OBIEE 11g.
    The query is like this:
    select count(a.col1),a.col2 from table1 a ,table2 b where a.col1=b.col1
    and not exists
    (select d.col1 from table3 d, table4 e
    where d.col1=e.col1 and d.col1=a.col1)
    group by a.col2
    There are 4 tables:
    table1, table2, table3, table4.
    This is what I need in a report; these tables are used in other places also.
    I would prefer, if possible, to do it for a single report, i.e. in the analysis itself.
    If that is not possible then maybe in the RPD, though a change in the RPD
    is reflected everywhere, not just in the one report.
    thanks

    Implementing this in the report itself can be a tricky and tedious job, so it is better to have it in the RPD: Physical layer - Select table type.
    Use NOT IN in place of NOT EXISTS. You can simply put the SELECT statement in the Select table type.
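    For example, with the tables from your post, the NOT IN version of the query would be:
    select count(a.col1), a.col2
    from table1 a, table2 b
    where a.col1 = b.col1
    and a.col1 not in
        (select d.col1 from table3 d, table4 e
         where d.col1 = e.col1)
    group by a.col2
    One caveat: NOT IN returns no rows at all if the subquery returns a NULL, so add "and d.col1 is not null" to the subquery if col1 is nullable.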
    Hope it is clear.

  • How to tell what fonts are missing in a Photoshop CS5 document?

    Hello!  I have received some work from a graphic designer.  I need to modify some of the text in the document.  When I open the document I get an error message:
    Some text layers might need to be updated before they can be used for vector based output.  Do you want to update these layers now?  [Update] [No]
    If I click [Update] it replaces the font with Myriad Pro.  If I click [No] the text looks the way the designer intended, but I have the exclamation mark-triangle symbol over the layer with the text.  If I try to edit the text I get another warning:
    Editing or rendering the text layer "Text Layer" will cause its layout to change.  Continue?  [Cancel]  [OK]
    The same behavior described above occurs.  If I click [Cancel] I obviously can't edit the text.  If I click [OK] it changes the font to Myriad Pro.
    I would like to determine what font the layer is using.  I could swear that previous versions of Photoshop would tell me what fonts were missing when I opened a document.  I can't use the text tool to select the text without it changing the font.  Other posts have mentioned using the text tool to look at the text drop-down to find a greyed-out font which indicates the missing font.  I do not have that.  I only have perhaps 200 fonts installed.  Can anyone tell me how to determine what fonts a .PSD document is using?  I would prefer not to bug the graphic designer if possible.  Thanks much!
    ~ Dan

    Not sure why you are getting "Font is present on the system but requires a layout change." Usually you do not get that; instead, the name of the missing font shows.
    We use the Extensis Universal Type Client, so fonts load automatically upon opening files, and we never really get missing fonts in Photoshop.
    Actually, Font Book has auto font loading if you want to give that a try: Font Book >> Preferences.
