Problem with output / Data format

Using CFMX 7.0.2
Gupta SQLBase DB
Here is my Query:
<cfquery datasource="_Casewin" name="HomeData">
SELECT CRL_RLINE.RDESC, CRL_RLINE.QUESNO
FROM CRL_RLINE
WHERE CRL_RLINE.R_CASE = '#FORM.CASE_NO#'
</cfquery>
RDESC is a memo data type, but returns this:
1 ðÙT393252 rODBC Ser¨{² \©
2 ðÙ
3
where the actual data is this:
CRL_RLINE.QUESNO CRL_RLINE.RDESC
1 NEIGHBOUR REMOVED FENCING AND WILL NOT REPLACE
2 N.S.W
3 NO
I've tried string formatting, trims, and justification.
Any ideas?

Thanks Grizzly
The driver is the Gupta SQLBase driver, and other queries work
fine from all connections. It just seems to be this one table
throwing the weird symbols, and only in ColdFusion.
The database front end that enters the data is fine, and doing
SQL queries directly in SQLBase, or even in Excel and Access with linked tables,
works fine. It is only the output through ColdFusion.
This is the only field that uses the memo format, so I'm not sure
if that is the reason.

Similar Messages

  • Problem with input data format - not "only" XML

    Hi Experts,
    I have a problem with the input data format.
    I get some data from JMS (MQSeries), but the input format is not clean XML.
    It is something like a flat file with XML content inside…
    Example:
    0000084202008-11-0511:37<?xml version="1.0" encoding="UTF-8"?>
    <Document xmlns="urn:xsd:test.01" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" Sndr="0001" Rcvr="SP">
    ...content....
    </Document>000016750
    The problems in this file are:
    1. data before the XML declaration <?xml version="1.0"?> -> 0000084202008-11-0511:37
    2. data after the closing tag </Document> -> 000016750
    This data destroys the XML format.
    Unfortunately XI is not the only receiver of these files, and we can't change the file format in the MQSeries queue (before it goes to XI).
    My goal is to get XML from this file:
    <?xml version="1.0" encoding="UTF-8"?>
    <Document xmlns="urn:xsd:test.01" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" Sndr="0001" Rcvr="SP">
    ...content....
    </Document>
    My questions:
    1. Is there any way or technique in XI to delete the data 0000084202008-11-0511:37 from this file?
    2. Is there any way to get only the XML from this file?
    Thanks.
    Regards,
    Seo

    Hi Buddy,
    Which adapter is XI using on the inbound side?
    If you use inbound File adapter content conversion, you could replace these values with nothing and then pass the result on to the scenario (a rough Java sketch of an alternative approach follows below).
    Does that help?
    Regards
    cK
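    If content conversion cannot strip the envelope cleanly, another option would be a simple Java mapping that keeps only the substring between the XML declaration and the closing </Document> tag. This is only a sketch of the idea, not something posted in this thread; the class and method names are made up:

    // Sketch only: trims the numeric envelope around the XML payload.
    // Assumes the payload starts at "<?xml" and the root element is <Document>.
    public class XmlEnvelopeStripper {
        public static String extractXml(String raw) {
            int start = raw.indexOf("<?xml");
            int end = raw.lastIndexOf("</Document>");
            if (start < 0 || end < start) {
                throw new IllegalArgumentException("no XML document found in payload");
            }
            return raw.substring(start, end + "</Document>".length());
        }
    }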

  • HT3345 Importing from xls problems with the date format (mixed european format)

    Hi.
    I made some spreadsheets with NeoOffice and saved them in xls format so I can import them into Numbers. Doing so in OpenOffice, LibreOffice, or even (worst case) Microsoft Office on a PC works just fine, but importing them with Numbers 2.0.4 just doesn't work.
    I have Mac OSX 10.5.8 and a PowerPC G4 1,67 GHz.
    The date format in all other programs is "21.05.2011", but Numbers creates "21/05/2011" out of it, although my international setting is set to 23.05.2012. It defines the cells as a custom format of my own and then applies the totally wrong format.
    Does anyone know how to solve this problem, or how to change all these weird custom formats to my international setting, which would be correct?
    Best regards.

    Hi Franklin,
    I used to have a similar problem with dates too. To make my forms and reports work for all dates, I send the parameter from Forms to the report in text format. In the report, I grab that parameter as text (I mean the parameter created in Oracle Reports really has Character as its datatype property), then I manipulate it in my SQL to convert that "date" into a real date using the TO_DATE function (a rough JDBC sketch of this idea appears at the end of this message). Usually I use DD-MM-YYYY as my date format.
    Hope this helps.
    Regards,
    Franko Lioe
    Hello friends at www.oracle.com,
    when I sent some information to Reports, one of the items was a date field. Here, the format mask may vary from one computer to another: some computers here use the American date format, while my computer uses the Brazilian date format.
    The fact is that I was sending the date to Reports using the mask dd/mm/yyyy, and Reports (on the computer that uses the American date format) couldn't recognize it, stating that I was sending an invalid month. So I had to change the date to dd-mon-yyyy, and Reports then ran on the other computer but no longer runs here, which forced me to change my date format to the American one.
    Is there any way for me to use a date format mask that is valid on all computers here? If other computers, with other format masks, hit the same problem, using this program may become very complicated.
    Thanks for all answers,
    Franklin Gongalves Jr.
    [email protected]
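    A rough JDBC sketch of Franko's suggestion, with illustrative table and column names (not taken from the thread): the date travels as plain text and an explicit TO_DATE mask does the conversion, so the client machine's date settings no longer matter.

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    public class DateAsTextExample {
        // Sketch only: EMP and HIRE_DATE are placeholder names.
        public static void printHiredSince(Connection conn, String ddMMyyyy) throws Exception {
            String sql = "SELECT empno, hire_date FROM emp"
                       + " WHERE hire_date >= TO_DATE(?, 'DD-MM-YYYY')";
            try (PreparedStatement ps = conn.prepareStatement(sql)) {
                ps.setString(1, ddMMyyyy); // e.g. "21-05-2011", format matches the mask
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        System.out.println(rs.getLong(1) + " " + rs.getDate(2));
                    }
                }
            }
        }
    }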

  • Problem with output data when calling into Oracle stored procedure

    I have a problem that I think I've seen posted by others, but I can't find it anywhere on the forum. Here's what it looks like:
    I have an application that sends a query parameter called custID in a URL to a JSP page.
    http://domain.com/decrypt.jsp?custID=ewsw
    The JSP subsequently calls into the method below to run a decryption stored procedure on an Oracle db. The custId parameter works fine from most clients. However, I have seen the decryption stored procedure return invalid information on some clients in the following case:
    http://domain.com/decrypt.jsp?custID='eirwx
    Here is how I define my call to the DCUSTID (decryption stored procedure):
    public long dCustID(String sCustID) throws SQLException {
        CallableStatement cs = null;
        try {
            long nCustID = 0;
            if (conn == null || conn.isClosed()) { getLocalConnection(); }
            String sp = "BEGIN DCUSTID(?,?); END;";
            cs = conn.prepareCall(sp);
            cs.setString(1, sCustID);                           // IN: encrypted customer id
            cs.registerOutParameter(2, java.sql.Types.NUMERIC); // OUT: decrypted numeric id
            cs.execute();
            nCustID = cs.getLong(2);
            return nCustID;
        } finally {
            release(cs);
        }
    }
    I have not been able to find a problem with the stored procedure (although I'm not counting that out). Is there any way that the JSP or the code above is corrupting the data before it gets to the stored procedure? I'm very suspicious of the single quote at the beginning of the custID query string parameter.
    Thanks

    > The JSP subsequently calls into the method below to run a decryption stored procedure on an Oracle db. The custId parameter works fine from most clients. However, I have seen the decryption stored procedure return invalid information on some clients in the following case:
    The quote character at the beginning doesn't matter.
    How do you know that the routine you posted returns invalid information? Are you printing the value retrieved before you return it in the code you posted? That is the only way to tell whether that code is the problem versus some translation problem in something else.
    And what do you mean by 'invalid'? It does return a numeric value, right? Is it negative? Or so large that it couldn't be a key? Or what?
    The input parameter to the stored proc is a varchar and not a char correct?

  • Problem with the Date format and sending it to R/3

    Hi all,
    I have an Adaptive RFC application which fetches the data from the R/3 and displays the Employee's Personal Data.
    I am using a ZFM which is customised as per the requirement from the Standard BAPI_EMPLOYEE_GETDATA.
    Now the problem I am facing is that Date is a defaulted and optional parameter in the RFC.
    If I execute the RFC in R/3 without a date, it does not fill the internal tables; if I pass the date, it fills the internal table and fetches the records.
    So in the WD application I take the employee's FirstName, LastName, or the Employee ID as input.
    The user can give any of the above inputs, a combination of FirstName and LastName, or only the Employee ID.
    Now the main problem is the Date, which is of type Date.
    I have also tried the SimpleDateFormat class, but I am not able to achieve the result.
    I have also looked at the links and threads on SDN, but I am unable to solve the problem.
    So please help me out as soon as possible.
    Thanks & Regards,
    Dhruv Shah

    Hi,
    By default, the RFC accepts dates in SQL date format (yyyy-mm-dd). If you are using a date picker from WD, it sets the date directly in SQL date format. In case you are trying to pass the date to the RFC in some other way, you have to convert it into SQL date format before passing it.
    If you are passing a String date in dd-mm-yyyy format, you can try this method to convert it to a SQL date and pass it to your RFC.
    public java.sql.Date sqlDateConvert(String date) {
        //@@begin sqlDateConvert()
        java.sql.Date dateObj = null;
        try {
            // expects a dd-mm-yyyy string, e.g. "21-12-2010"
            StringTokenizer tempStringTokenizer = new StringTokenizer("" + date, "-");
            int dd = Integer.parseInt(tempStringTokenizer.nextToken().trim());
            int mm = Integer.parseInt(tempStringTokenizer.nextToken().trim());
            mm = mm - 1;   // Calendar months are zero-based
            int yyyy = Integer.parseInt(tempStringTokenizer.nextToken().trim());
            Calendar cal = Calendar.getInstance();
            cal.set(yyyy, mm, dd);
            dateObj = new java.sql.Date(cal.getTime().getTime());
        } catch (Exception e) {
            // fall through and return null if the string cannot be parsed
        }
        return dateObj;
    }
    Hope this will help you.
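    Since the original poster mentioned trying SimpleDateFormat, here is a shorter sketch of the same dd-mm-yyyy to java.sql.Date conversion using that class instead of StringTokenizer. The class and method names are illustrative; this is not code from the thread.

    import java.text.ParseException;
    import java.text.SimpleDateFormat;

    public class DateConvertAlternative {
        // Sketch: convert a dd-MM-yyyy string into the java.sql.Date the RFC model expects.
        public static java.sql.Date toSqlDate(String ddMMyyyy) {
            try {
                SimpleDateFormat in = new SimpleDateFormat("dd-MM-yyyy");
                in.setLenient(false);  // reject impossible values such as "32-13-2010"
                return new java.sql.Date(in.parse(ddMMyyyy).getTime());
            } catch (ParseException e) {
                return null;           // same fallback behaviour as the snippet above
            }
        }
    }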

  • Problem with Java Dates and UPDATE for SQL2000

    I am having problems with date formats in Java. I am trying to put the current date and time into a SQL table; here is the code I am using:
    var Today = new Date()
    var conn = Server.CreateObject( "ADODB.Connection" )
    conn.Open( "Provider=SQLOLEDB;Server=(local);Database=BillTracking;UID=sa;PWD=;")
    var sql = "UPDATE BillAssignments SET DatePosted = " + Today + " WHERE AssignmentID = '" + Request.QueryString("AssignmentID") + "'"
    var rs = conn.execute(sql)
    I keep getting different errors and I have been unable to find a solution yet. I know that I need to change the date format from the Java standard to the one that SQL likes.
    Help....
    Norm...

    Please tell us where the Java part of this comes in. I see that you are using JavaScript to load up data via an ADO connection (presumably on an IIS platform), but I do not see where you are using Java.
    Lee
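    For comparison, if this really were Java rather than JScript/ADO, the usual JDBC approach would be a PreparedStatement with setTimestamp, so no date string ever has to be formatted into the SQL text. This is only a sketch: the connection URL and credentials are placeholders, while the table and column names come from the post above.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.Timestamp;

    public class MarkPosted {
        // Sketch only: adjust the JDBC URL and credentials for your environment.
        public static void markPosted(String assignmentId) throws Exception {
            try (Connection conn = DriverManager.getConnection(
                     "jdbc:sqlserver://localhost;databaseName=BillTracking", "sa", "");
                 PreparedStatement ps = conn.prepareStatement(
                     "UPDATE BillAssignments SET DatePosted = ? WHERE AssignmentID = ?")) {
                ps.setTimestamp(1, new Timestamp(System.currentTimeMillis())); // current date/time
                ps.setString(2, assignmentId);
                ps.executeUpdate();
            }
        }
    }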

  • Facing problem with a date column in select query

    Hi,
    I am facing a problem with a date column. Below is my query, and it is failing with "invalid number format model".
    Query: SELECT *
    FROM EMP
    WHERE trunc(LAST_UPDATED) >= to_date(to_char(22-05-2009,'dd-mm-yyyy'),'dd-mm-yyyy')
    LAST_UPDATED column is "DATE" data type.
    Please help me. Thanks.

    Radhakrishna Sarma wrote:
    SeánMacGC wrote:
    WHERE LAST_UPDATED >= to_date('22-05-2009','dd-mm-yyyy');
    You do not need the TRUNC here in any case.
    I don't think so. What if the user wants only data for 22nd May and the table also has records dated later than the 22nd? In that case your query will not work. In order for the index to be used, I think the query can be written like this...
    I think Sean is right, though. Use of the TRUNC function is quite useless for the condition given here, since the TO_DATE used by the OP will always point to midnight of the specified date, in this case 22-05-2009 00:00:00.
    Regards,
    Jo
    Edit: I think Sean proved his point... ;)
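    For what it's worth, one index-friendly way to get exactly one day's rows without wrapping the column in TRUNC is a half-open date range. This is a sketch of that idea in JDBC, not necessarily the query Radhakrishna had in mind; the table name comes from the original post and the selected column is a placeholder.

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    public class OneDayRange {
        // Sketch: LAST_UPDATED >= day AND LAST_UPDATED < day + 1 keeps the column bare,
        // so an index on LAST_UPDATED can still be used.
        public static void printRowsForDay(Connection conn, String ddMMyyyy) throws Exception {
            String sql = "SELECT empno FROM emp"
                       + " WHERE last_updated >= TO_DATE(?, 'DD-MM-YYYY')"
                       + " AND last_updated < TO_DATE(?, 'DD-MM-YYYY') + 1";
            try (PreparedStatement ps = conn.prepareStatement(sql)) {
                ps.setString(1, ddMMyyyy);
                ps.setString(2, ddMMyyyy);
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        System.out.println(rs.getLong(1));
                    }
                }
            }
        }
    }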

  • Problem with loading data to Essbase

    Hi All,
    I have a problem with loading data into Essbase. I've prepared a MaxL script to load the data, calling a rule file. The source table is located in an Oracle RDBMS. The script works correctly, i.e. it generally loads data into Essbase.
    But the problem lies in the fact that after deleting the data from Essbase, when I try to load it again from the source table I get the message: WARNING - 1003035 - No data values modified by load of this data file - although there is no data in Essbase... I've also tried to change the load mode from 'overwrite' to 'add to existing values' (in the rule file) but it doesn't help... Any ideas what I can do?

    Below are a few lines from EPM_ORACLE_INSTANCE/diagnostics/logs/essbase/dataload_ODL.err:
    [2013-09-24T12:01:40.480-10:01] [ESSBASE0] [AGENT-1160] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1116830016] Received Validate Login Session request
    [2013-09-24T12:01:40.482-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1114724672] Received client request: Get App and Database Status (from user [admin@Native Directory])
    [2013-09-24T12:01:54.488-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1101564224] Received client request: MaxL: Execute (from user [admin@Native Directory])
    [2013-09-24T12:01:54.492-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1115777344] Received client request: MaxL: Describe (from user [admin@Native Directory])
    [2013-09-24T12:01:54.492-10:01] [ESSBASE0] [MLEXEC-2] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1115777344] Output columns described
    [2013-09-24T12:01:54.494-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1102616896] Received client request: MaxL: Define (from user [admin@Native Directory])
    [2013-09-24T12:01:54.494-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1102616896] Received client request: MaxL: Fetch (from user [admin@Native Directory])
    [2013-09-24T12:01:54.494-10:01] [ESSBASE0] [MLEXEC-3] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1102616896] Record(s) fetched
    [2013-09-24T12:01:54.496-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1116830016] Received client request: MaxL: Fetch (from user [admin@Native Directory])
    [2013-09-24T12:01:54.498-10:01] [ESSBASE0] [AGENT-1160] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1114724672] Received Validate Login Session request
    [2013-09-24T12:01:54.499-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1101564224] Received client request: Get Application State (from user [admin@Native Directory])

  • Problem with the date conversion

    Hi Friends,
    I am facing a problem with date conversion. Actually, my requirement is to pass the date to the screen based on the user's settings (SU01).
    I have fetched the user's date format setting by using the function module SUSR_GET_USER_DEFAULTS. The function module picks up the exact user date setting (such as MM/DD/YYYY, MM.DD.YYYY, DD.MM.YY).
    After that I used the function module FORMAT_DATE_4_OUTPUT to convert the user's date format setting into the system date format.
    For the English language case the function module FORMAT_DATE_4_OUTPUT works fine, but the function module is not supported for other languages.
    Can you please provide a function module for converting dates according to the user settings?
    This function module is very important for us.
    Thanks
    Charan
    Moderator message: date conversion questions = FAQ, please search before posting.
    Edited by: Thomas Zloch on Dec 21, 2010 2:19 PM

    Hope this logic helps you.
    DATA LF_DATE    TYPE DATS VALUE '21122010'. " 21-dec-2010
    DATA LF_DATE_BI(10).
    WRITE LF_DATE TO LF_DATE_BI.  "Now LF_DATE_BI contains the date in user format
    "Now populate the value LF_DATE_BI to the screen field

  • Problems with retrieving data from tables with 240 and more records

    Hi,
    I've been connecting to Oracle 11g Server (not sure exact version) using Oracle 10.1.0 Client and O10 Oracle 10g driver. Everything was ok.
    I installed Oracle 11.2.0 Client and I started to have problems with retrieving data from tables.
    First I used the same connection string, driver and so on (O10 Oracle 10g) then I tried ORA Oracle but with no luck. The result is like this:
    I'm able to connect to the database. I'm able to retrieve data, but only from small tables (e.g. with 110 records it works perfectly using both the O10 and ORA drivers). When I try to retrieve data from tables with around 240 or more records, the retrieval simply hangs (nothing happens at all: no error, no timeout). The application seems to hang forever.
    I'm using Powerbuilder to connect to Database (either PB10.5 using O10 driver or PB12 using ORA driver). I used DBTrace, so I see that query hangs on the first FETCH.
    So for the retrievals that hang I have something like:
    (3260008): BIND SELECT OUTPUT BUFFER (DataWindow):(DBI_SELBIND) (0.186 MS / 18978.709 MS)
    (3260008): ,len=160,type=DECIMAL,pbt=4,dbt=0,ct=0,prec=0,scale=0
    (3260008): ,len=160,type=DECIMAL,pbt=4,dbt=0,ct=0,prec=0,scale=1
    (3260008): ,len=160,type=DECIMAL,pbt=4,dbt=0,ct=0,prec=0,scale=0
    (3260008): EXECUTE:(DBI_DW_EXECUTE) (192.982 MS / 19171.691 MS)
    (3260008): FETCH NEXT:(DBI_FETCHNEXT)
    and this is the last line,
    while for retrievals that end, I have FETCH producing time, data in buffer and moving to the next Fetch until all data is retrieved
    On the side note, I have no problems with retrieving data either by SQL Developer or DbVisualizer.
    Problems started when I installed 11.2.0 Client. Even if I want to use 10.0.1 Client, the same problem occurs. So I guess something from 11.2.0 overrides 10.0.1 settings.
    I will appreciate any comments/hints/help.
    Thank you very much.

    pgoel wrote:
    I've been connecting to Oracle 11g Server (not sure exact version) using Oracle 10.1.0 Client and O10 Oracle 10g driver. Everything was ok.
    Earlier (before installing the new stuff), did you ever try retrieving data from big tables (like 240 and more records)? If yes, was it working?
    Yes, with the Oracle 10g client (before installing 11g) I was able to retrieve any data, whether it was 10k+ records or 100 records. Installing the 11g client changed something, so that even using the old 10g client (which I still have installed) fails to work. The same problem occurs whether I'm using the 10g or 11g client now. PowerBuilder hangs on retrieving tables with more than about 240 records.
    Thanks.

  • Problem with universal data cleanse

    I have a problem with universal data cleanse.
    I am using DS 3.2.x (12.2.2).
    I created:
    Dictionary: TEST
    Classification: TEST_CL
    Custom outputs: category: TEST_CAT
    Dictionary Entry:
    Primary: BO
    Classification: TEST_CL
    Gender: Unassigned
    Secondary information:
    When used as: TEST_CAT
    Standard: Business Objects
    Rule file:
    DataCleanse Rule File v2.0;
    TEST_CAT = TEST_CL;
    action = TEST_CAT;
    TEST_CAT = 1 : TEST_CAT : 1;
    end_action
    Data Cleanse:
    Input: Multiline1
    Options:
    Parsing Dictionary: TEST
    Rule file: .../test_rules.dat
    Break On Whitespace Only: Yes
    Parser Sequences Multiline1: TEST_CAT
    Output:
    Parent_component: TEST_CAT1
    Generated_field_name: TEST_CAT and RULE_LABEL
    Generated_field_class: parsed/standardized
    Content_type: none
    and EXTRA field
    I wanted to replace the word "BO" with the standard words "Business Object", but this is the result: "BO" appears in the extra field,
    and there are "null" values in the other fields.
    What am I doing wrong?
    Thanks for all help!
    P.S. I don't have cleansing packages installer.

    There seem to be a couple of things going on here:
    1. If you are using your custom dictionaries, then you have to map your input to MULTILINE1 and enable the custom parsers - just something to be aware of
    2. You mentioned that you made some changes to the existing dictionary and you are not seeing any changes. To be clear, do you have different TEST and PRODUCTION environments? Or is it the same environment except that you have a local DS repository and for the dictionary you are pointing to another repository (using Dictionary --> Manage Connections)?
    Having the dictionaries on a different repository should not make any difference as long as you point to them in your designer using the Dictionary --> Manage Connection option.
    So I think there may be some issue with your job setup and/or dictionary values need to be looked at. You can start by adding another output field named "EXTRA" to see whether or not your data is getting parsed at all. Also, make sure the entry "CLEANME" is classified as FIRM_NAME_ALONE in the dictionary and that you are selecting the correct dictionary name in the Datacleanse Transform options.

  • Spotify and facebook problem with the date

    Hi, everyone. My Spotify account has a problem with the date. I changed the date on Facebook and nothing happened; the Spotify account still has the wrong date, and it is not the same as the Facebook account.


  • Is anyone else having problems with Apertures Date fields?

    After experiencing a systems failure I began restoring my images from backups. I have had no end of problems with image dates. I have come to the determination that Aperture is NOT using either the "Create Date" or "DateTimeOriginal" as the image "Date" or "Date Created" data if other fields contain other dates.
    Here is a clip from my image metadata using exiftool:
    File Modification Date/Time     : 2012:09:23 19:53:15-04:00
    File Access Date/Time           : 2013:01:16 11:34:38-05:00
    File Inode Change Date/Time     : 2013:01:16 11:31:32-05:00
    Create Date                          : 2006:02:20 11:51:12.10
    Date/Time Original              : 2006:02:20 16:51:12.10
    Modify Date                          : 2006:02:20 11:51:12.10
    And here is what Aperture utilized on import
    Date:                               9/23/12 7:53:15 PM EDT
    Date Created                         9/23/2012 7:53:15 PM
    So Aperture utilized the File Modification Date/Time as the Create Date despite the fact that the Create Date field is present and properly formatted in the original image.
    According to the Aperture mapping table, this shouldn't be happening.
    Now, before someone recommends that I use Aperture's Date Adjust utility: I'm talking about slightly over 30,000 images. Editing images one at a time, or in blocks, when you don't know which field Aperture is using as the Create Date, would require individual inspection of each image followed by manual adjustment of each image. That approach isn't acceptable.

    Hmm, OK, let me restate the issue then; I thought I was clear. I could have pointed out in my original post, however, that only the EXIF and IPTC date fields were displayed from the metadata dump.
    The data set presented in my first post is the EXIF dataset from the file which clearly shows the image was captured by digital camera at
    Create Date                          : 2006:02:20 11:51:12.10
    Date/Time Original              : 2006:02:20 16:51:12.10
    (The delta of 5 hours is the result of Zulu versus local time offset)
    But, when I ingested it into Aperture, the import routine utilized the IPTC field
    File Modification Date/Time     : 2012:09:23 19:53:15-04:00
    which is updated by the OS whenever you move the file around outside of Aperture (a number of image data fields are updated by the OS; the filename, for example, is another). This resulted in the image date fields being stamped in Aperture as:
    Date:                               9/23/12 7:53:15 PM EDT
    Date Created                         9/23/2012 7:53:15 PM
    Which obviously came from the File Modification Date/Time field and not (either) the Create Date or Date/Time Original fields.

  • Problem with order data

    Hi, I have a problem with ordering data. A string column has these values:
    A....
    B....
    PSTA-FRA
    PSTA+FRA
    Q....
    R....
    If I order the data directly from the DB (select column1 from table order by column1), I obtain this result:
    PSTA-FRA
    PSTA+FRA
    If I create a simple report on that table and order on that column, the result is:
    PSTA+FRA
    PSTA-FRA
    I can obtain the correct output if I set the 'Perform group on server' parameter, but if I write a selection formula:
    {table.column} >= 'PSTA+FRA'
    the record containing the PSTA-FRA value does not appear.
    I've tried both Crystal 8.5 and Crystal 11 R2 SP6; the database is Sybase SQL Anywhere over an ODBC connection.
    Thanks in advance

    Ordering is done according to the ASCII values of the characters.
    In R2, create a new report, log onto your DB, and select Command for your data source. Paste in the SQL you use to get the data in the order the server uses. Then you don't have to use CR to do the sorting. DB servers are much more efficient at collecting the data than CR is.
    Thank you
    Don
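    A small Java illustration (not from the thread) of the point about ASCII ordering: '+' is character 43 and '-' is character 45, so a plain binary sort puts PSTA+FRA before PSTA-FRA, whereas the database's own collation may order them the other way.

    import java.util.Arrays;

    public class AsciiSortDemo {
        public static void main(String[] args) {
            String[] values = { "PSTA-FRA", "PSTA+FRA" };
            Arrays.sort(values); // natural String order compares character codes
            System.out.println(Arrays.toString(values));        // [PSTA+FRA, PSTA-FRA]
            System.out.println((int) '+' + " < " + (int) '-');  // 43 < 45
        }
    }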

  • Problem with billing date at closed period

    Hi
    We have a problem with the billing date in the transaction. Previously we could set a different (earlier) billing date than the actual posting period, and now it is not working. We receive an error message that the posting period is closed.
    I attached an example invoice which we managed to post and where you can see what I'm thinking about, parking number: 1024337, billing document: 841835. In this case the billing date was 30.11.2007, but the posting date was 15.07.2008, and we managed to post.
    Please help with that issue.
    yps

    Hi,
    it is due to the posting period being closed for the month of 07/2008.
    You need to open posting for that period.
    Regards,
    Greeshma
