Bad transparency in autoshapes

Hey.
I'm having a problem where I've put autoshapes with 55% transparency over JPG pictures in Word (2003). When I import the Word file into RH, the autoshapes are converted to GIFs, and they look really terrible (they don't look transparent, and they're pixelated). Is there any way to make them into PNGs, or to make them look better?
A picture of my problem
Thanks a lot for the help!

No, it turns out it is not due to this bug (although this bug is also annoying in that you can't have transparent nodes). Even with opaque nodes, or with a custom cell renderer that just returns labels, the background gets garbled, so it must be in JTree itself.
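For anyone trying to reproduce the garbling, a renderer of the kind described above (one that "just returns labels") might look roughly like this minimal sketch; the class name is illustrative, not from the original post:

import java.awt.Component;
import javax.swing.JLabel;
import javax.swing.JTree;
import javax.swing.tree.TreeCellRenderer;

// Hypothetical minimal renderer: a non-opaque JLabel, so the renderer
// itself never paints a background fill over the tree.
class PlainLabelRenderer extends JLabel implements TreeCellRenderer {
    PlainLabelRenderer() {
        setOpaque(false); // do not fill our own background
    }

    public Component getTreeCellRendererComponent(JTree tree, Object value,
            boolean selected, boolean expanded, boolean leaf, int row,
            boolean hasFocus) {
        setText(String.valueOf(value)); // just return a label for the node
        return this;
    }
}
// usage: tree.setCellRenderer(new PlainLabelRenderer());

As noted above, even with a renderer like this the background still gets garbled, which points at JTree's own painting rather than the renderer.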

Similar Messages

  • Transparant window background?

    Is it possible to set a transparant background on the front panel, but still have non-transparant indicators show up? (So it looks like the indicators are floating on the screen?)
    Thanks in advance; hope it's possible, still haven't managed to do it.
    Cornelis

    Thanks a lot! This just solved my problem! I tried searching all over the forum; weird I didn't find it. Perhaps my English was so bad: "transparant" is actually spelled "transparent".

  • Export Flash image to Illustrator: bad colors and no transparency

    I've tried to export my vector work from Flash to Illustrator.
    The result is bad: colors have faded and transparent colors became opaque.
    It is a shame that Flash doesn't provide a correct export function to Illustrator, now that Flash and Illustrator are both Adobe products.
    Does somebody know a better way, or software, to export Flash/SWF vector images to Illustrator or PDF?
    Thanks.
    Henri

    You may want to follow this thread; no answer so far, but we live in hope…
    http://forums.adobe.com/message/2341411#2341411

  • Bullets and Transparent text boxes

    Hello,
    I've checked out the existing postings on this issue and appreciate Rick's tip on fixing the font blur issue. However, I have another bug with bullets in transparent text boxes.
    Upon preview or publish of a project, I see a ghost color (a little lighter) of my background color, which is light blue. So basically the text box is no longer transparent and looks really bad.
    Any ideas?
    Lisa

    Hi Rick,
    Let me describe what is happening a little more clearly. Here are the basics, so maybe you can reproduce it. I believe I did come up with a fix for myself.
    1) Add a background image to the slide that is merged, not an overlay image.
    - My image is a PNG file imported into the library and added to every slide. It has a solid light blue background with a dark blue strip on top and a brand from the client at the top left (very basic).
    2) Add a transparent text box caption.
    3) Add text that includes bullet points.
    Result: When you put a text caption with bullets over a slide background that is an image (even a solid color), you get a ghost that outlines the text caption dimensions. If the text caption doesn't have bullets, the ghost outline doesn't appear, but the text is blurry. Obviously the text can be fixed by highlighting one character in #C0C0C0.
    My temporary fix: Because the ghost outline seems to appear when a text caption with bullets is on top of a background image, I modified my background image so that the light blue part (the area where all the content will be) is actually transparent and no longer a color. I then set the color for all slide backgrounds to be the light blue that was originally in my image. This seems to work, but there are some color fading issues that I need to explore and work out more.
    Thanks much, and I look forward to pushing the boundaries of Captivate once I get my head around it more.
    P.S. If I have a background image on each screen (for branding and consistency purposes), when I set up a simulation, do I have to merge each screenshot into the background, or can the screenshots created by Captivate just overlay my background image?
    Lisa

  • SCXI-1321 Bad Chips and Function Calls

    8-4-04
    This message is intended for the PSE of the SCXI-1321 and that person's manager. Please route it accordingly. I would appreciate a phone response to 8**-2**-2*** so that I might share some specific experiences.
    To the managers of the SCXI product line,
    I have used the SCXI-1121 with the SCXI-1321 for some time now, approximately six years. Our company generally uses them for full-bridge pressure transducers and similar applications. I have some suggestions that would make them more saleable. I also believe that you will never know how many customers you have already lost on this product line, for the reasons given below.
    1. For many years the shunt resistor on the 1321 has been virtually unusable. What SHOULD be possible is to command the shunt resistor through a LabVIEW command, and then read the virtual channel to see the shunted value. That IS the purpose of the feature. What happens is that once you set the shunt command and then read the virtual channel, the very act of reading the channel UN-shunts the resistor. What you read is some value partway between, because the chip relay from Clare actually ramps open and closed, unlike a mechanical relay. If your timing loop to read the value is consistent, you will consistently read a BAD value. Several VIs tried to command the shunt, then quickly read the virtual channel, then loop and do it again, and appeared to have solved the problem. Further investigation showed that all that was really happening was that the readings appeared stable because of the fast loop time, but were actually mid-release of the shunt for all readings.
    2. NI eventually started saying "don't use virtual channels with the shunt". In fact, there is a channel string that starts with the word "shunt" that was recommended. The only problem with this is that the data that comes back is not scaled! Keep in mind that NI has been pushing "virtual channels" on users for many years as the way to do business, and so entire test stands are developed around them and MAX. Because we depend on MAX to do the scaling, how are we supposed to be able to read the data scaled? Does NI expect customers to duplicate the efforts of MAX for every virtual channel and scale it separately? Do I have to keep duplicate tracking of scaling information so that my virtual channels work in the main code, and again elsewhere so that my calibration routine can read the data? This is messy, and for most customers, untenable.
    3. I was told, literally, year after year and release after release, that the problem would be fixed in future versions of NI-DAQ. Each release proved that the problem was not resolved. I am sure that many customers were not repeat customers once they found the problems associated with using a simple shunt on the board. I myself needed this hardware due to the 1121/1321 parallel-operation capability.
    4. After years of fighting the issues above, and what seemed like random problems with the shunting of channels, a second hardware-related issue has finally been identified. See SRQ #600753 with Michelle Yagle. The Clare solid-state relays installed on the 14 different 1321s I bought ALL had the same defect. Imagine what this means for troubleshooting! You swap boards to try to isolate the fault, but they are all defective, so you falsely rule out the board because the problem didn't resolve with a different board. The chip lot #0027T12627 covers a wide range of 1321 serial numbers (see my SRQ for details), which means this impacts a lot of potential customers. The part is rated by the manufacturer to 80C (176F); this particular lot stops functioning as low as 96F! A great deal of testing with heat guns, shop air, and a thermocouple showed that all the chips in this lot fail somewhere between 96F and 110F, well below the manufacturer's specs or NI's specs for the board. I also tested lot 98, from a board two years older, as well as new chips from lot 03, both of which worked perfectly all the way up to 212F (100C).
    Conclusions:
    I believe that the people who have tried to use your 1321 have generally had a miserable time of it, unless they already do their own scaling (as some of the Alliance members do). This is because the shunt feature is not compatible with virtual channels, because NI did not offer a reasonable fix for this problem over a four-plus-year period, and because NI does NOT advertise this limitation when people are making hardware selections. Further, I believe there are many people who have boards with the same bad lot of chips on them and don't know it. They may have elected to hardwire their own resistors after fighting with random (temperature-related) problems. They might have elected to hardwire their own resistors to get around the NI-DAQ limitation. They might not even use the shunt resistor. But one thing is certain: NI needs to clean up this board's operation, make sure that customers with the serial numbers named in my SRQ are notified of the bad chips, and fully test that the use of the relays is transparent to LabVIEW virtual channels in the future. People will not trust NI products with this type of problem, and will not buy the product again if all the features don't work!
    I helped NI resolve similar issues with the SCXI-1126 years ago, which had a number of scaling issues in NI-DAQ (the board read great when you used even ranges like 1k, 2k, 4k, 8k, etc., but if you set a virtual channel to 0-2200, the readings were off). I plotted the problem, documented it with my sales rep, and worked at length with tech support. The boards also had a lot of bad buffer-amplifier chips that allowed crosstalk between channels, causing a lot of other spurious problems. Those experiences were horribly frustrating, but I now buy the product with confidence. It took a lot of push on my part to convince NI there was a real underlying problem. This 1321 board is similar.
    I want NI products to be the best they can be, as I have built my career around them. Please take this feedback to heart, and contact me at 8**-2**-2*** to talk about these issues. Additionally, I am speaking at NI Week this year and will be available for discussion. Contact me at the same number.
    Sincerely,
    Tim Jones
    Test Equipment Design Engineer
    Space Shuttle Program
    Space, Land and Sea Enterprise, Hamilton Sundstrand

    Tim
    Thank you for your feedback on the SCXI-1321 terminal block.
    I wanted to see if it would be possible for you to move your application to NI-DAQmx, the new driver for DAQ and SCXI as of NI-DAQ 7.0. I have tested the shunt calibration using an SCXI-1121 and SCXI-1321, and it works as it should. There are two ways to do the shunt calibration with DAQmx.
    The first is to use the Calibration feature in the DAQ Assistant. When creating a DAQmx task in either MAX or LabVIEW, you can select the "Device" tab and click on the Calibration button. It will then ask you whether you want to do a null calibration, a shunt calibration, or both. It will then do the selected calibrations and save the values to the task.
    The other option to measure the shunted value is to use the AI.Bridge.ShuntCal.Enable property node in LabVIEW. By setting this property to True, the driver will enable the shunt for the current measurements. If you are taking a strain measurement, the value will be converted to strain.
    We are currently looking into the issues you are seeing with the Clare relays to see if other users could be affected.
    Brian Lewis
    Signal Conditioning PSE
    National Instruments

  • Hi, why can't I see my transparent basic vector after saving?

    Let's say I draw a circle with the vector tool: stroke 3, white, transparent background. All good until I save. I save as GIF or PNG. If I upload it to the website, the background is not transparent; it is white. But when I bring in a bitmap, edit it, and save as GIF or PNG, it is OK. What am I doing wrong with the vector? I flattened it, etc. I'm not so bad; I was able to reproduce the animated football logo. Please help. So basic, but I am stuck. Thanks

    Hi
    What color do you expect your background to be?
    By opening the Optimize panel, you can choose a .gif or .png format and add alpha transparency from the drop-down menu.
    But are you sure that it is necessary to use transparency, and not a simple JPG image?

  • ImageIO, Timers, Transparency and a Rant

    Hi all,
    <BEGIN RANT>
    After years of writing games and programming next-gen consoles, I foolishly decided to write a few online Java games. Like many, I chose Java because it is very similar to C++ and accessible cross-platform on web pages by the masses with no extra downloads required (BTW, I'm not using J3D for this reason). Instead I've found it to be nothing but a burden.
    Today's gripes are below. If I sound like I'm having a rant, well, after fighting it all this week, I am! Any help/suggestions/discussions are most appreciated. I'm currently using J2SDK 1.4.1_02.
    1) Why can't I enable hardware acceleration (currently transparency sucks on the framerate) without requiring permissions? What exactly am I going to do with it to make it a security risk?
    2) Why doesn't Java (without J3D or any other extensions) support the high-resolution timer? Such a basic issue, and it has been plaguing these forums for years. It seems I can't even do a workaround, as using JNI requires permissions and an external module on the user's machine, so I can't do this from an APPLET without signing and installing something. Sure, I could just about use 15 ms on W2K and make Win98 users suffer 20 fps with the 50 ms resolution timer. I tried (and succeeded) using my own timers, but that means I can't sleep (as sleep > 0 will sleep for at least the timer resolution), so my CPU usage is at 99%. And it destroyed my Win98 machine :-)
    3) Has anyone had problems with ImageIO.read in an applet? I find that occasionally, when reading an image from my web server, it reads it twice (I guess it found an error and re-requests it) and corrupts the image. At first I thought it was my PHP corrupting the image, but it does it on direct accesses as well. Toolkit.createImage does not seem to have this problem (but then I have to wait for it to load; currently I construct a cursor with the image, which seems to sort it, and then paste it into a BufferedImage rather than the inaccessible image type returned by Toolkit; a sketch of that wait-then-copy approach is below). Toolkit does, however, have the advantage that I can use extensionless image files and prevent caching, which ImageIO.read seems unable to handle (it usually loses the palette of GIFs etc. without the extension hint). My web server is Apache 2.0.45 on Win2K, BTW.
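    For illustration, here is a minimal sketch of that Toolkit-based loading, using a MediaTracker to block until the load completes rather than the cursor trick; the class name is hypothetical and not from the post:

    import java.awt.Image;
    import java.awt.MediaTracker;
    import java.awt.Toolkit;
    import java.awt.image.BufferedImage;
    import java.net.URL;
    import javax.swing.JPanel;

    public class ToolkitLoader {
        // Load an image via Toolkit, block until it is fully loaded,
        // then copy it into an accessible BufferedImage.
        public static BufferedImage load(URL url) throws InterruptedException {
            Image img = Toolkit.getDefaultToolkit().createImage(url);
            MediaTracker tracker = new MediaTracker(new JPanel()); // any Component will do
            tracker.addImage(img, 0);
            tracker.waitForID(0); // wait for the asynchronous load to finish
            if (tracker.isErrorID(0)) {
                throw new RuntimeException("image failed to load: " + url);
            }
            BufferedImage copy = new BufferedImage(img.getWidth(null),
                    img.getHeight(null), BufferedImage.TYPE_INT_ARGB);
            copy.getGraphics().drawImage(img, 0, 0, null);
            return copy;
        }
    }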
    These issues concern APPLETS. If I wanted to write an application, I would use C++ and OpenGL, with very little extra work required for each specific platform (I'd only support PC, Mac and x86-based Linux anyway). The point is that I want it embedded in a web page, and I don't want the user to have to install anything other than the Java VM.
    What about the rest of you?
    What are your experiences with Java? Good? Bad?
    Do you guys think there is any future for it?
    Does Java 1.5 have anything to offer?
    IF it comes down to it, and I NEED to sign my code and INSTALL code, then I will just abandon Java altogether and stick to windoze users (it will take a fraction of the development time and be far less painful for me to use ActiveX or .NET).
    Oh, and if you're just going to reply and say something like "I think Java's the bestest thing ever, it's so much better and cooler than C++ and all my friends think so too. Cos my teacher says so" then don't waste my time.
    If you have games experience and know what you're on about then I'd love to hear your views.
    <END RANT>
    Cheers
    -Simon

    Hi Paul,
    Thanks for your views.
    Badly because it provides the security protections it says it does?
    No, I mostly appreciate the security, but some things seem a little too secure. For example, I want to use hardware acceleration for my translucencies, but Java won't let me....
    You can always ask users to change the permissions your applet runs under, I guess, assuming the browser supports it. But at that point, why not just write your game in C++ and tell people to download it and run it normally? If you're trying to write a Quake clone in Java and distribute it as an applet...don't bother.
    The point is that I don't want to do this. I want ppl to have to visit the web site to play the game without having to grant permissions, install anything, or agree to run signed code (not just for advertising purposes, but also because some of the games will be multiplayer). This is why I'm using Java and not ActiveX or .NET.
    I am perfectly capable of writing any genre of game to be cross-platform in C++ (if I use OpenGL then I only need a thin platform-specific wrapper, and I just compile it for my required targets: PC, Mac, x86 Linux), and I really don't see the point in attempting it as a Java application. The games I'm currently doing for my website in Java are simple 2D games suitable as applets. The main problem I'm having is that I need to sleep to free up the CPU, but the low-res timer resolution means that I can only run at 20 fps on Win98 when sleeping. Yes, it runs OK, but by no means as smoothly as I'd like.
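    One common compromise for that (not from this thread, just a sketch of the usual hybrid pacing trick; the 20 ms margin is an assumption tuned to a coarse Win98-style timer) is to sleep in chunks and busy-wait only the tail of each frame:

    // Pace a game loop to a target frame time despite a coarse Thread.sleep.
    public class FramePacer {
        private final long frameMillis;

        public FramePacer(int fps) {
            this.frameMillis = 1000L / fps;
        }

        // Call at the end of each frame, passing the frame's start time.
        public void pace(long frameStart) throws InterruptedException {
            // Sleep in 10 ms chunks while well short of the deadline,
            // leaving ~20 ms of margin for the coarse timer.
            while (frameMillis - (System.currentTimeMillis() - frameStart) > 20) {
                Thread.sleep(10);
            }
            // Busy-wait (politely) for the last few milliseconds.
            while (System.currentTimeMillis() - frameStart < frameMillis) {
                Thread.yield();
            }
        }
    }

    This frees the CPU for most of the frame while keeping pacing tighter than the raw timer resolution allows.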
    I think java may be used in action gaming once it's embedded into an environment that really supports it (e.g., as a well-supported programming environment for PlayStation 2) (and somehow I doubt that OpenGL bindings for java will work as a gaming environment, although that's just a hunch), and it's fine for non-action games.
    Yes, it can do the job, but the fact is that Java is and always will be slower than code compiled directly for the target CPU, so my entire audience needs higher-spec machines to run it than I'd like. I would not write anything that is too CPU-intensive in Java, as there are also still a lot of low-spec machines out there running Win98, which I (unlike most ppl on these forums) am still willing to support.
    Cheers
    -Si

  • Strange transparency problem trying to print a pdf made in Pages

    Hello,
    Let me apologize in advance for this one; I think it's probably a doozy.
    Not having the know-how to efficiently do a layout in InDesign, I have been doing print projects in Pages. It has worked fine in the past to print to Adobe PDF PostScript from Pages, open Distiller, and convert to a CMYK PDF from there. For my current project it worked fine for the first two renditions, but the third edition (incidentally the one the printing company printed) looked very "muddy." Fonts were very rough and rasterized badly, and the whole thing looked kind of "brown-washed." They claimed to use a program called "Apogee," which they said told them there was a transparent layer over everything, converting all the fonts to CMYK and thus making them lose their nice crisp blackness. They've also said that it must be a setting in Pages that caused this, but I can't see how that's possible! When I look at the source file in Pages there is no such transparency. Several of the images do have their opacity decreased to varying degrees to help the whole piece blend; however, the transparency they say Apogee pulled off the top cannot be seen from my end. We are down to the wire trying to get this job to print again, but I am stumped as to how to ensure that this rendition of the file will be done correctly.
    As perhaps a side note, Distiller seems to be a pretty quirky application for me, often failing to convert on the first or second try, but somehow "magically" working on the second or third attempt for no apparent reason and without my doing anything differently.
    Thanks so much for any help. I am definitely cracking out the books and forcing myself to learn InDesign after this nightmare!
    - A-J L.

    They claimed to use a program called "Apogee"
    Agfa Apogee, introduced in 1998, was simply the first prepress workflow product that processed page-independent PDF instead of processing page-dependent PostScript into an internal and intermediate display list prior to marking. The prepress operator would be using Apogee to preflight the PDF you submit as printing master, and the problems found in the preflight are those of which you were informed.
    As perhaps a side note, Distiller seems to be a pretty quirky application for me,
    Adobe Acrobat Distiller has been difficult to configure since the day it was introduced.
    Adobe Acrobat Distiller is a PostScript interpreter that converts page-dependent PostScript page description programs into page-independent PDF page descriptions without programming loops and links that produce dependencies. PDF uses a directory design for the objects in the page descriptions, a design that originates with Xerox Interpress (the idea of a directory is in fact printed on page 390 of "Interpress - The Source Book" by Harrington and Buckley published in 1988).
    converting all the fonts to CMYK and thus making them lose their nice crisp blackness.They've also said that it must be a setting in Pages that has caused this, but I can't see how that's possible!
    It is left to 'application intelligence' whether to match RGB black for type to CMYK black for type or to match RGB black for type to separationBlack for one plate/plane on a four plate/plane offset press. The problem with an offset press is that it lays down the inks at inking units that are some distance apart which invites misregistration. QuarkXPress, PageMaker, InDesign and other authoring applications developed for PostScript imaging on offset presses have ink palettes with four fixed inks of which Black produces separationBlack.
    Pages does not have this possibility, so if you want your type to image as deviceK / separationBlack then you need to select your type and apply the following procedure, or set up styles with the procedure applied: open the Apple Colour Palette, select the CMYK slider mode, select some type, select the Advanced icon (the colour space icon on the left), in the drop-down menu select Device CMYK, and set the sliders to C0, M0, Y0, K100. This procedure lets the typographer ensure that deviceK / separationBlack is imaged; but for as long as Microsoft Word has produced type that is RGB, prepress processes have intercepted this in PostScript and reset it to deviceK / separationBlack, including Helios colour management for OPI.
    In prepress there is no way to please all parties. If you as developer say, "OK, I'll set the type to deviceK and then show this colour managed on the display," then you will have designers, and people who write for designers like Sandee Cohen, yelling at you for making type less black than they think type should be, even if in fact the colour of type printed on diffusing uncoated paper is way lighter than they think. And if you say, "OK, I'll set the type to R0 G0 B0 so it is as black as the designers think type should be," then you'll have the prepress crowd with their offset presses yelling at you. Neither party accepts that black is a colour: one wants type to be as black as possible regardless, and the other wants type to mark on one and only one printing plate/plane regardless.
    converting all the fonts to CMYK and thus making them lose their nice crisp blackness
    I would wager a box of Carlsberg beer that the prepress operator is ignorant of the controls in Agfa Apogee, and that Agfa Apogee has a control to catch RGB type and convert to deviceK / separationBlack. Try telling the prepress operator to READ the manual -:).
    Several of the images do have their opacity decreased to varying degrees to help the whole piece blend, however this transparency they say Apogee pulled off the top cannot be seen from my end.
    Generally speaking, you should not be sending plain PDF to prepress. You should be sending PDF/X-3, which depends on you as designer flattening transparency, or better still PDF/X-4, which passes the problem of flattening transparency to the application intelligence of the prepress raster processing system. PDF 1.3 does not support transparency, nor does EPS, nor does PostScript at any language level.
    /hh

  • User exit or BADI for ME22N

    Hi,
    I need a user exit or BAdI for defaulting the Plant at line-item level.
    The scenario goes like this: whenever a user changes an existing purchase order and adds a new line item to it, I need to trigger a user exit or BAdI that reads the previous line's Plant and updates the current line with the same Plant.
    I tried user exit MM06E005, but it didn't seem to work.
    Can anyone please help me?
    I am new to BAdIs, so I am not sure how to find the BAdI and use it. It would be great if anyone could help me with this.
    Thanks
    Ramya

    Hello
    The BAdI ME_PROCESS_PO_CUST is the right one. The method PROCESS_ITEM is triggered whenever the user changes something in the purchase order at item level and executes any kind of function (e.g. ENTER, CHECK or SAVE).
    However, instead of overwriting the user input via the BAdI, you should implement the method CHECK (Closing Check), where you can validate the user input. If any of your validations fail, you can send an error message and set the CHANGING parameter CH_FAILED = 'X'. This approach is much more transparent for the user.
    NOTE: In order to "send" an error message you need to add the include mm_messages_mac to your implementing class. For an example, see class CL_EXM_IM_ME_PROCESS_PO_CUST (should be available on ERP 6.0):
    METHOD if_ex_me_process_po_cust~process_item .
      DATA: ls_mepoitem TYPE mepoitem,
            ls_customer TYPE mepo_badi_exampl,
            ls_tbsg     TYPE tbsg.
      INCLUDE mm_messages_mac. "useful macros for message handling
    * here we check the customer's data
      ls_mepoitem = im_item->get_data( ).
      IF ls_mepoitem-loekz EQ 'D'.
    * check field badi_afnam
        IF ls_customer-badi_afnam IS INITIAL.
          mmpur_metafield mmmfd_cust_02.
          mmpur_message_forced 'W' 'ME' '083' text-003 '' '' ''.
        ENDIF.
      ENDIF.
    ENDMETHOD.                    "IF_EX_ME_PROCESS_PO_CUST~PROCESS_ITEM
    Regards
      Uwe

  • Font Rendering Got Bad After 32.0 Update

    After the recent update, fonts started to render badly: kind of blurry, with jagged edges and transparent parts. It used to be okay D:
    Sometimes it becomes okay for a moment as I scroll down on pages, and sometimes when I stop scrolling the font stays as it is, as if there are specific positions for the text to render properly. However, not all of the fonts are affected, only the ones inside the scroll area.
    At times when the fonts render properly, the bookmarks font, title bar, and tab names are unaffected and still appear not-smooth. The same thing does not happen in other browsers, and all my other system fonts look okay; it's really just Firefox.
    Restarting Firefox = didn't work
    Restarting Computer = didn't work
    Tried setting up Clear Type = didn't work
    Tried enabling and disabling Hardware Acceleration on Firefox = didn't work
    Tried enabling and disabling the Graphics card itself = didn't work
    Tried just about all of the suggestions I found on Google = none worked
    Tried complete uninstall and re-install of Firefox 32.0 = didn't work
    Graphics Driver are updated ~
    Windows is updated ~
    I'm using Windows 7 Ultimate 32 bit
    Graphics AMD Radeon HD 6800
    I fixed it by downgrading to Firefox 31, which makes me sure that 32.0 is the culprit. As I was about to send the message above, I tried downgrading as my last resort, and it worked. Now I'm only sending this as a report. I am hoping you'll figure out what caused it, because I am quite strict (OC) that everything on my PC must remain updated >_<; thus I want to use Firefox 32.0 as much as possible.

    No one to help on this?

  • Why does saving as a transparent GIF in Photoshop give bad output?

    Hi, the problem I have been facing for a long time is: when I Save As, or even Save for Web, to get a transparent image, the output in the IE browser looks very bad. It has broken edges even if I save at maximum quality. Kindly help me here, or email me at [email protected]

    Boy, did you come to the wrong place...this is the Lightroom Feature Request forum...nothing to do with Photoshop (well directly anyway). You'll need to wander down the hall...

  • "Stuck" Image in Transparent Background

    After Effects 11.0.2.11
    Mac OS 10.7.4
    A "ghost" frame from an old movie is stuck in the transparent / checkerboard background.  Always the same movie / frame.   Even in a new comp without any layers this old quicktime movie shows up.  Was a moving movie, this ghost effect is now just a single still frame.   Put one item in the comp and it disappears.  Turn off the eyeball for all of the layers and the ghost image comes back.
    Does not show up in final renders .. yet.   Mostly in just (Auto / Half) rez comp viewer previews.
    I've tried deleting prefs.    Cleared disk cache.    Cleared Conformed Media Cache.  Restarted the program.    Restarted the computer.      It's been like this for weeks.
    Now it has started showing WITH other layers "on".  This is new.   Makes it tough to work.
    IT department says they have no idea, that "we'll have to wipe your entire computer" but before going THAT route wanted to see if anyone else has ever had this problem and if they've discovered a solution.
    Thanks,
    Marc
    In the examples below, this "rooftop scene" not only isn't supposed to be in the image, it's not even in the AE project file.
    Here it is actually with other layers set to "on", which has never happened before:

    Hello people out there!
    I am facing the same issue.
    For a few days now I have had a very annoying ghost issue in After Effects (CC 12.2.1.5).
    I really have no clue what the problem is or how I should solve it.
    Sometimes, when I set the opacity of a layer to 0%, I can still see it. It is like a ghost. Even if I turn the little eye off or delete it, the footage/ghost won't go away.
    I also tried to empty the cache (while disabling refresh) and cleaned the database and everything, but the problem is still there.
    (Also when I restart After Effects or my whole computer.)
    Sometimes it even shows me that my ghost is part of another footage item (while soloing layers), but I already checked that footage by revealing it in the composition and checking the footage in Explorer. I also tried to save it to a different file type and relink it, but it was still not working.
    Here are a few pictures of my problem below:
    https://dl.dropboxusercontent.com/u/.../ae_issue1.jpg
    https://dl.dropboxusercontent.com/u/.../ae_issue2.PNG
    https://dl.dropboxusercontent.com/u/.../ae_issue3.jpg
    https://dl.dropboxusercontent.com/u/.../ae_issue4.jpg
    Sometimes, when I scrub with the slider through the composition, it appears and disappears, and when I release the mouse it is sometimes there and sometimes not.
    It is even in my rendering, which is really bad because the video is due tomorrow :S...
    Does anyone know if I did something wrong, or if it is a program bug?
    I am really desperate right now!
    my machine: HP Z220 Workstation
    Windows 7
    Intel (R) Core(TM) i7-3770 CPU @ 3,4 Ghz
    16 GB Ram
    Xeon E3-1200 v2/3rd Gen
    All drivers are up to date.
    Sincerely,
    Mandy

  • Badi or user exit for FTR_CREATE for modifying the Payment Details

    Hi,
    I need a BAdI or user exit that will modify the internal table for the "Payment Details" in transaction FTR_CREATE. Basically, instead of using the default entries in the "Payment Details", data from a Z custom table will overwrite the "Payment Details" (i.e. the internal table containing them). This should create entries in the transparent table VTBZV with the values from the Z custom table.
    I'm trying to implement the BAdI FTR_TR_GENERIC, but I cannot find where and how to modify the original "Payment Details" entries, or whether it is even possible to do this.

    Hi Ravi,
    I was trying to do that, but I am not sure where to start. I tried to modify the contents of PI_PROXY_TRANSACTION->A_TAB_CASHFLOW using MODIFY <itab>, but an error occurred when I tried to activate it, saying that the class/interface attribute is read-only and cannot be modified. Is there a method that I need to call in order to modify the attribute? (I'm not that familiar yet with classes and methods, though.)
    Also, just to test, I manually changed the contents of the two internal tables containing the "Payment Details" data during debug mode. But after the program ended its run and finally saved the data, table VTBZV was not updated with the manual data I introduced. So I'm not sure whether the BAdI FTR_TR_GENERIC (which is what I'm implementing) really allows the "Payment Details" to be modified, or whether I'm modifying the correct internal table(s).

  • Problems exporting PDF in CS5, text missing, images not transparent....

    I am using InDesign CS5 v7.0.4 on a Mac running 10.8.1.
    (This exact problem has happened to me in previous versions of ID as well, though.)
    When I export a document to PDF and view it in Preview OR place it into another ID document, every other page has an issue where some or all of the text is missing, and images that should be transparent are not, or are missing as well. Actually, the first two pages in my current document are OK; in the rest, every other page is missing text or images, or has non-transparent images that should be transparent. The font I am using is Avenir, and the images are .png files created in PS. The images are in the master.
    WHY is this happening? What do I need to do to fix it?
    Thanks in advance for your assistance. I am a mostly self-taught ID user since version 1, just getting back into using it after a long break.

    pixelpusher_mama wrote:
    OK, sorry, yes, it looks the same in Acrobat as it does in Preview as it does when placed into ID.
    I was able to save as a non-current version of PDF (Acrobat 4) and it shows correctly. (???) What am I missing here? I mean, I have a workaround, but?
    So the original PDF is bad, rather than there being a problem with the file after placing. That implicates the original document, and since Acrobat 4 compatibility works, it's probably either a transparency or a layer issue.
    Try trashing your prefs and export again to see if it works. See Replace Your Preferences

  • PDF shows multiply-transparent jpgs, but does not print them

    Hi all, I have to print some store vouchers with different logos, but too bad, some of them are JPGs. I've set them to "multiply" with 60% opacity, and when I export to PDF it actually shows the pages as it should, but when I print them, these logos magically don't appear. What the heck am I doing wrong? Attached are screens of my working profile. Thanks!

    While I think your boss is making a big mistake and sacrificing quality in the output, he is the boss. Make a new PDF preset -- start with X-1a, then change the standard to None and check the box to simulate overprint. Does that work? Do you see thin white lines (stitching) around transparent areas? That's a major problem on screen and with digital printing from flattened PDFs.
