Problem with time-based disaggregation

Hi!
In my planning area ZPA_REG, I have two data views: one is by week and the other is by month. I enter 100 at month level for October 2007. Then I go to the weekly data view and I see
S40  25
S41  25
S42  14
S43  25
S44  11
My question is about S42 and S44. For S44, since only 3 days fall in October, that could explain the result. But I don't understand why the value for S42 is 14.
The Time Stream ID (transaction /SAPAPO/CALENDAR) is based on the US calendar; its calculation type is "with gaps".
The storage bucket profile is by Day, Week, Month and year.
The time-based disaggregation for my key figure is type P.
Can you help me, please?
Thanks in advance,
LB

1. Check whether there are any holidays in that week; it looks like there could be two holidays there.
2. Zero out all the values in the weeks and re-enter 100. If values already exist, the system sometimes follows the existing proportions by default instead of your planning area's disaggregation type.
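Conceptually, proportional time-based disaggregation splits the monthly value across the weekly buckets according to each bucket's share of working days in the time stream. A minimal Python sketch of that principle (an illustration only, not SAP's internal algorithm; the bucket clipping to October and the holiday list are assumptions you would take from your own calendar):

```python
from datetime import date, timedelta

def workdays(start, end, holidays=()):
    """Count Mon-Fri days in [start, end], skipping listed holidays."""
    d, n = start, 0
    while d <= end:
        if d.weekday() < 5 and d not in holidays:
            n += 1
        d += timedelta(days=1)
    return n

def disaggregate(total, buckets, holidays=()):
    """Split `total` across buckets proportionally to their working days."""
    days = {name: workdays(s, e, holidays) for name, (s, e) in buckets.items()}
    grand = sum(days.values())
    return {name: round(total * n / grand, 1) for name, n in days.items()}

# Weekly buckets overlapping October 2007, clipped to the month
weeks = {
    "W40": (date(2007, 10, 1), date(2007, 10, 7)),
    "W41": (date(2007, 10, 8), date(2007, 10, 14)),
    "W42": (date(2007, 10, 15), date(2007, 10, 21)),
    "W43": (date(2007, 10, 22), date(2007, 10, 28)),
    "W44": (date(2007, 10, 29), date(2007, 10, 31)),
}
print(disaggregate(100, weeks))
```

Passing, for example, holidays={date(2007, 10, 8)} (Columbus Day in the US calendar) lowers W41's share and raises the others, which is the kind of effect the first answer describes.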

Similar Messages

  • Problem with time based rman recovery

    Hi, and happy Boxing Day!
    I am following to the letter the instructions in the Sybex Fundamentals II book, p. 348, regarding an RMAN time-based recovery. Here is the syntax:
    RMAN> run {
    2> allocate channel ch1 type disk;
    3> set until time 'DEC 26 2007 10:30:00';
    4> restore database;
    5> recover database;
    6> sql 'alter database open resetlogs';
    7> }
    and here is the error:
    allocated channel: ch1
    channel ch1: sid=11 devtype=DISK
    executing command: SET until clause
    released channel: ch1
    RMAN-00571: ===========================================================
    RMAN-00569: =============== ERROR MESSAGE STACK FOLLOWS ===============
    RMAN-00571: ===========================================================
    RMAN-03002: failure of set command at 12/26/2007 11:06:45
    ORA-01858: a non-numeric character was found where a numeric was expected
    Is this a format issue? Any tips please!
    Thanks,
    DA

    You can only register a database once if you are using a recovery catalog, so whoever or whatever told you to re-register is simply getting their terminology in a knot!
    There **is** a sort-of "re-register" command which does sometimes need to be issued after a resetlogs: it's called a RESET command. See http://tinyurl.com/29t5lr for details. A "reset" is, effectively, a "re-register".
    One of the details you will see is that you only need to do a manual reset if it was **you** that issued a resetlogs, outside of RMAN. If you get RMAN to do the resetlogs, however, then there's no need to also issue the reset command, because RMAN already knows about it (and has already updated its own metadata accordingly).
    The other thing you'll notice from the documentation is that if you aren't using a recovery catalog then the concept of issuing a reset is entirely redundant. That's because you can't recover to a prior incarnation of the database without a catalog: it's one of the main reasons for wanting to run in catalog mode in the first place. No catalog = no ability to step through incarnations = no need to inform RMAN that you're in a new incarnation.
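On the question asked at the end ("Is this a format issue?"): yes, ORA-01858 at the SET command means the literal 'DEC 26 2007 10:30:00' does not match the session's NLS_DATE_FORMAT (typically DD-MON-RR). A sketch of the usual fix, assuming a standard NLS setup, is to pass an explicit mask through TO_DATE (or to set NLS_DATE_FORMAT in the environment before starting RMAN):

```sql
RMAN> run {
  allocate channel ch1 type disk;
  set until time "to_date('26-DEC-2007 10:30:00', 'DD-MON-YYYY HH24:MI:SS')";
  restore database;
  recover database;
  sql 'alter database open resetlogs';
}
```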

  • Problem in Time Based Publishing Content

    Hi everyone,
    I'm working with time-based publishing.
    Using XML Form Builder I created 3 contents, i.e. 3 XMLs.
    Then I created one iView for reading the contents (a KM Navigation iView) and set up its properties, such as
    layout set, layout set mode, and root folder of contents.
    After creating the iView I checked the preview: all 3 contents were visible in my iView.
    Now I want to show time-based content in that iView,
    i.e. contents displayed according to their time settings.
    For that I enabled time-based publishing and the lifetime of each content (XML) in the following way
    (time-dependent publishing in KM): I clicked to the right of my folder's name -> Details -> Settings -> Lifetime; there you have to enable time-dependent publishing. Then I opened the folder, clicked to the right of the document -> Properties -> Lifetime, and entered the time span of the document.
    After the lifetime setup, I again checked the preview of the iView for reading the contents (created earlier):
    all 3 contents are still displayed, including the content whose lifetime has expired.
    Please give me a solution for this, or tell me if more configuration is required.
    Note:
    I need to display only the contents whose validity window (from time until time) currently applies.
    Thanks in advance,
    Vasu

    I have waited more than 3 hours for the settings to apply,
    but I couldn't see any changes.
    Any other solution?
    Thanks
    Vasu

  • Time-based publishing stopped working

    Hi,
    We currently have a problem with time-based publishing in KM. A few days ago, documents stopped becoming visible once they reach their "valid from" date, and we have not been able to publish documents with TBP on that system since.
    These errors keep appearing in the knowledgemanagement.#.log files, which seem related to this issue:
    #1.5#C000AC10005900130000012D00000B3400040F325EE040B8#1142608921379#com.sapportals.wcm.WcmException#irj#com.sapportals.wcm.WcmException.WcmException(62)#System#0#####ThreadPool.Worker1##0#0#Error##Plain###application property service not found com.sapportals.wcm.repository.service.timebasedpublish.wcm.TimebasedPublishException: application property service not found
         at com.sapportals.wcm.repository.service.timebasedpublish.wcm.TimebasedPublishServiceManager.getApplicationPropertyService(TimebasedPublishServiceManager.java:589)
         at com.sapportals.wcm.repository.service.timebasedpublish.wcm.TimebasedPublishServiceManager.setValidEventSent(TimebasedPublishServiceManager.java:540)
         at com.sapportals.wcm.repository.service.timebasedpublish.wcm.TimebasedPublishServiceManager.handleVisibleResources(TimebasedPublishServiceManager.java:327)
         at com.sapportals.wcm.repository.service.timebasedpublish.wcm.CheckValidFromSchedulerTask.run(CheckValidFromSchedulerTask.java:65)
         at com.sapportals.wcm.service.scheduler.SchedulerEntry.run(SchedulerEntry.java:128)
         at com.sapportals.wcm.service.scheduler.crt.PoolWorker.run(PoolWorker.java:107)
         at java.lang.Thread.run(Thread.java:479)
    The KMC version is SP2 with Patch level 29 hotfix 1, and is running on Windows Server 2003 with an Oracle database. We have opened an OSS message but while we are waiting I thought I would post this here in case anyone ever experienced this.
    Best regards,
    Olivier

    Hi,
    1. Have you checked that the TBP service is still assigned to your repository?
    2. If you create a new repository and assign these services, does it work?
    The time-dependent publishing service enables users to define a time frame during which documents are published (visible).
    Note that it requires the application property service.
    This service cannot be configured.
    Patricio.

  • Time based animation

    Hello,
    I'm looking for opinions on the best way to animate a game for Android and/or iOS with time-based code.
    I have some knowledge of the subject, but have not tested the various methods on my mobile device.
    Some options I know of would be using an ENTER_FRAME event, calculating delta time, and using either a fixed or variable time step (or perhaps variable but with a limit). Others have mentioned doing the same, but with a Timer calling the function rather than ENTER_FRAME.
    I have also read of setInterval() as an option. Some recommend updateAfterEvent(), some don't.
    I seem to remember it was recommended not to use timers with Flash on mobile. I might be wrong about that.
    thanks for any input.
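The fixed-timestep-with-accumulator pattern mentioned above can be sketched independently of Flash (Python purely for illustration; in Flash the per-frame body would live in an ENTER_FRAME handler):

```python
import time

FIXED_DT = 1 / 60   # fixed simulation step (seconds)
MAX_FRAME = 0.25    # clamp long frames ("variable but with a limit")

def pump(acc, frame_dt, update):
    """Consume one frame's elapsed time in fixed simulation steps.
    Returns the leftover accumulator to carry into the next frame."""
    acc += min(frame_dt, MAX_FRAME)
    while acc >= FIXED_DT:
        update(FIXED_DT)
        acc -= FIXED_DT
    return acc

def main_loop(update, render, frames):
    """Illustrative driver: measure real elapsed time per frame,
    advance the simulation in fixed steps, then draw once."""
    prev, acc = time.perf_counter(), 0.0
    for _ in range(frames):
        now = time.perf_counter()
        acc = pump(acc, now - prev, update)
        prev = now
        render()
```

The MAX_FRAME clamp is the "variable but with a limit" idea: a long pause (garbage collection, backgrounded app) then costs at most 0.25 s of simulated catch-up instead of a spiral of updates.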

    What's wrong with a tweening engine like TweenLite / TweenMax... it's what I use.

  • Time based aggregation - partition

    I have a question regarding partitioning of an InfoCube.  Can you use logical partitioning on an InfoCube designed with "time-based aggregation"?

    Conversion of monthly and weekly time buckets: if you have overlapping time buckets in your planning area, data is stored in storage buckets that have the lowest granularity of the planning buckets. The overlapping periods are treated as separate storage buckets:
    21 May - 27 May: 7 days
    28 May - 31 May: 4 days
    01 June - 03 June: 3 days
    04 June - 10 June: 7 days
    You can get a temporary mixed-bucket display in the planning book by using the planning bucket profile 9ASTORAGE. You can make this change from the interactive planning book; if you can tell us what version you are on, we can help with how to get there. As far as I know, this display cannot be made permanent.
    If you want a mixed time bucket as the permanent display, you need to create fiscal year variants in Customizing and create your time bucket and planning bucket profiles accordingly.
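The bucket arithmetic above can be made concrete with a short sketch (the year 2007 is assumed so the example dates fall on whole weeks; this only illustrates splitting weekly buckets at month boundaries, not APO's implementation):

```python
from datetime import date, timedelta

def month_end(d):
    """Last day of the month containing date d."""
    return (d.replace(day=28) + timedelta(days=4)).replace(day=1) - timedelta(days=1)

def storage_buckets(weeks):
    """Split weekly buckets at month boundaries, the way overlapping
    weekly/monthly planning buckets fall into separate storage buckets."""
    out = []
    for start, end in weeks:
        d = start
        while d <= end:
            piece_end = min(month_end(d), end)
            out.append((d, piece_end, (piece_end - d).days + 1))
            d = piece_end + timedelta(days=1)
    return out

weeks = [(date(2007, 5, 21), date(2007, 5, 27)),
         (date(2007, 5, 28), date(2007, 6, 3)),
         (date(2007, 6, 4), date(2007, 6, 10))]
for s, e, n in storage_buckets(weeks):
    print(f"{s} - {e}: {n} days")
```

The middle week (28 May - 3 June) straddles the month boundary and therefore splits into the 4-day and 3-day storage buckets listed above.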

  • Could Time Capsule be able to work connected to an Alpha R36 router with 3G based wifi net?

    Hello everyone from Spain.
    I live in a rural area and we don't have any internet connection apart from 3G. I bought an Alfa R36 3G Wi-Fi router and I was able to create a 3G-based Wi-Fi network at home. However, when I tried to connect my Time Capsule to the router to enhance my Wi-Fi network, I wasn't able to do it. I had no problems creating my own Wi-Fi network with the Time Capsule in the city, but I can't use it at home. Could you help me? I don't know if it's possible to do, and how.
    Here is the link of the router:
    http://www.alfa.com.tw/in/front/bin/ptdetail.phtml?Part=R36&Category=0
    Thank you very much!

    Make sure you plug the TC into a LAN port of the Alfa. Bridge the TC so it does not route (leave that to the Alfa) and just use the TC for the wireless network. Since you are doing that, turn off the Wi-Fi in the Alfa and use only the TC's. That should then work fine.

  • Assembly Point with mix of time based and discrete components

    Hi all,
    I have a scenario where I want to assemble 2 different components - one of them time based (assy data EXTERNAL_LOT), the other one discrete (assy data INV_SFC).
    Resource system rule "time based resource (genealogy)" is set to true.
    I set up and loaded slot config on resource for the time based component.
    Operation is configured to not allow SFC complete if components are missing (CT520 on PRE_COMPLETE hook)
    When I start the SFC at the operation and resource in the POD, the time-based component is assembled automatically (according to the device history record). However, once I open the assembly activity (COMP_LIST_DISPLAY, CT500) to assemble the discrete component, both components are displayed as not assembled in the component list. If I assemble only the discrete component and then try to complete the SFC, I receive an error message pointing out that not all components are assembled yet.
    I would have expected the time-based component to be displayed as already assembled, and the CT520 hook to recognize it as already assembled.
    Am I missing any settings?
    Or does such a mixed scenario not work at all?
    We are using ME 6.1.4.2
    Thanks for your help!
    BR Ulrike

    Hi,
    The issue you are seeing with CT520, whereby the hook does not recognize the time-based component as already assembled, was resolved in a later release. Please patch to the latest version; the associated release notes will detail which patch fixed it.
    You would need to use assembly point in POD to add the INV_SFC
    Thanks
    Steve

  • Min max lot size time based for use with SNP and PPDS

    Hi all, is there any way to set up time-based min and max lot sizes? I.e., we want a small maximum lot size for the first 3 months of our plan, which starts life in SNP and then converts to PPDS and into blocks for block planning, but for months 4 to 36 we want to leave the maximum lot sizes large, as there is no need for the small ones in that horizon.
    As far as I can see there is only a material/plant lot size and max lot size, with no way to have a different setting in a different time period.
    Thanks
    j

    Hi John,
    As you know, in the product master the lot size maintenance is time-independent, so that obviously cannot be used for your scenario. As per my understanding, to meet this with standard functionality you can maintain multiple product-specific t-lanes (for STRs) and PDSs (for planned orders) with the required lot size ranges and validity dates (for the short- or long-term horizon). But since the validities of the t-lanes and PDSs will not roll forward automatically, updating them will be a challenge.
    The other option could be to enhance the heuristic functionality at the lot size selection step while creating the order.
    Regards,
    Umesh

  • Time based publishing with WebDynpro ?

    Hello All,
    I need to validate documents from KM, depending on their time-based publishing properties, before showing them in my Web Dynpro application.
    I read the documentation on SDN and tried using this example:
    <a href="http://etower.towersemi.com/irj/portalapps/com.sap.portal.pdk.km.repositoryservices/docs/repositoryservices.html#timebased%20publishing">http://etower.towersemi.com/irj/portalapps/com.sap.portal.pdk.km.repositoryservices/docs/repositoryservices.html#timebased%20publishing</a>
    However, which JAR file or other references do I need in order to access this class: ITimebasedPublishServiceManager?
    Please help.
    Thanks in advance,
    Regards,
    Samta

    Hi everyone,
    I figured it out: we need to include the JAR km.shared.repository.service.timebasedpublish_api.jar for this.
    Thanks,
    Samta

  • Sales orders in TDMS company/time based reduction  are outside the scope

    Guys,
    I have had some issues with TDMS, in that it didn't handle company codes without plants very well. That was fixed by SAP. But I have another problem now. If I do a company code and time-based reduction, it doesn't seem to affect my sales orders in VBAK/VBUK as I would have expected. I was hoping it would only copy across sales orders that have a plant assigned to a company code specified in the company-code-based reduction scenario. That doesn't seem to be the case.
    VBAK is now about one third of the size of the original table (in number of records), but I see no logic behind the reduction. I can clearly see plenty of sales documents with a time stamp far earlier than what I specified in my copy procedure, and others with plant entries that should have been excluded from the copy because they belong to company codes other than the ones I specified.
    I was under the impression that TDMS would sort out the correct sales orders for me, but somehow that doesn't seem to be happening. I have to investigate further what exactly it did bring across, but just by looking at the target system I can see plenty of "wrong" entries, either with a date outside the scope or with a plant outside the scope.
    I can also see that at least the first 10,000 entries in VBAK in the target system have a valid-from and valid-to date of 00.00.0000, which could explain why the time-based reduction didn't work.
    Did you have similar experiences with your copies? Do I have to do a more detailed reduction, such as specifying tables/fields and values?
    Thanks for any suggestions
    Stefan
    Edited by: Stefan Sinzig on Oct 3, 2011 4:57 AM

    The reduction itself is not based on the date when the order was created; the logic extends it to invoices and offers, basically the complete update process.
    If you see data that definitely shouldn't be there I'd open an OSS call and let the support check what's wrong.
    Markus

  • A series of problems with Time Capsule

    I was an early adopter of Apple's "Time Capsule," which in theory sounds like a great idea but has been a disaster for me in practice. This is the story of my nightmare.
    In theory, Time Capsule is supposed to enable wireless, automatic backups of my hard drive via wifi. I liked the idea because I thought it would save me time and make backups so convenient that they'd be sure to happen.
    In practice, it's been slow, aggravating and buggy as ****.
    After I bought the device, I brought it home and set it up to do the initial backup of my laptop computer. My first mistake, as I learned later, was to try to do the initial backup wirelessly. The Time Capsule has an Ethernet connection that works faster than the wireless connection. (It ought to have a USB or Firewire port, which would be far faster than Ethernet, but apparently the Apple gods decided to save a nickel at the expense of quality and convenience.) It would also have been nice if the manual that came with the thing had recommended using Ethernet for the initial backup, but it didn't, so I didn't discover that this was even an option until I had already wasted several days.
    Yes, days. Not hours, but days.
    I started the initial backup on a Wednesday night. My laptop's hard drive had about 130 gigabytes of files on it, and after the initial backup had run for a couple of hours, I did a calculation based on megabytes/minute and figured that it should finish up sometime that weekend. This meant that I couldn't take my laptop away from the house or turn it off, but after I'd already invested a few hours in the process, I figured I'd just let it run until completed.
    Unfortunately, the longer the backup proceeded, the slower the megabytes/minute rate became. By the following Monday morning, it was only 2/3 completed, and the copy rate had slowed to such a crawl that I had no idea when it would finish. I therefore reluctantly interrupted the process, since I had meetings to attend where I needed my computer.
    By that time, I had done some further research online and learned that people were recommending doing the initial backup via Ethernet, so when I got home that Monday night, I decided to do it that way instead of via wifi. However, it was unable to resume the interrupted backup, so I had to start over from scratch. Over the course of several attempts to do this, I discovered moreover that the interrupted backup had corrupted the disk somehow, so eventually I had no choice but to erase the Time Capsule entirely before beginning a new backup. Each of these attempts took a half hour or an hour, so I wasted my Monday evening doing nothing but try and retry to start the backup. (Somewhere in the middle of this I also did a tech support call to Apple, which also meant time on the phone, sitting on hold, etc.)
    Finally, sometime on Tuesday I got the backup started, and by late that evening I had my first incremental backup. Hooray! Or so I thought.
    The thing ran OK for a month or so, and then for no apparent reason I discovered that backups were failing. Why? No idea. I called Apple tech support again. An hour or so on hold, then talking to an operator, then trying various things. Eventually the tech support guy told me I'd have to erase the hard drive again and do a new initial backup. Great. At least I knew by then that I'd need to do it via Ethernet, but of course it took the better part of a day for the backup to run, and starting over meant losing all of the history in my previous backups. But I did it.
    After that, I had a good run for several months...maybe six months. Then, for no apparent reason, the backups started failing again. This time I managed to get the backups working again by unplugging the Time Capsule, plugging it back in, and doing some reset procedures.
    A month later, the Ethernet connection started failing.
    I have an old iMac upstairs that I plugged in to the Ethernet connection so it can access the internet and file-share with my laptop. One day I noticed that the iMac's internet connection was no longer working. After some testing, it turned out that neither the iMac nor my laptop is able to connect anymore through the Ethernet connection. I tried several cables, tried playing around with settings, to no avail. I had a trip to Hong Kong coming up, so I decided I'd worry about the Ethernet connection after I got home.
    On the Saturday before my Monday morning flight to Hong Kong, my laptop died.
    It was working fine on Friday evening. I went to bed, got up, and the screen was black. In a panic, I drove to the Apple store and, after much pleading, got them to look at the thing in a hurry. Their tests showed that the logic board was bad. Fortunately, they had a replacement in stock, so they were actually able to repair the laptop in only two hours.
    Since then, however, I haven't been able to get backups to work at all on my Time Capsule.
    When the laptop died, I figured I might need to restore files from the Time Capsule, so I unplugged it and took it with me on my frantic drive to the Apple Store. After I brought it home and plugged it back in, the laptop recognizes that it exists, but instead of doing an incremental backup, Time Machine wants to start all over with a backup of more than 140 gigabytes.
    No ******* way.
    My plan at this point is to make an appointment at the Apple Store and take it in to someone at their Genius Bar. Maybe they can figure out why Ethernet isn't working and why the incremental backups aren't happening. In the meantime, I don't know if any of the backups that it has done to date are any good, so it's an uneasy feeling. And, of course, I've wasted more time than I care to think about just tweaking and nursing the thing.
    It just isn't worth it.
    If anyone has any suggestions for what I should be trying at this point, I'd love to hear advice.

    I've got some time here as I wait on my computer (more on that in a moment), so in the meantime I re-read Smokerz's reply to my message. Upon re-reading, I thought I'd respond again.
    Smokerz is saying basically that Time Capsule is great for purposes OTHER than backing up my computer's hard drive: "Make your primary TM backup using an external FW800 drive first. ... I don't do TM on TC but use TC as my internet file server for my family who all lives far from one another and stream movies to my ATV. Much better use of TC, eh?"
    There might be a case for that use of Time Capsule, but the product is advertised and sold primarily as a wireless backup device. The main page for the product on apple.com describes it as "Automatic wireless backup for your Mac. ... Time Capsule is a revolutionary backup device that works wirelessly with Time Machine in Mac OS X Leopard. It automatically backs up everything, so you no longer have to worry about losing your digital life."
    Smokerz is therefore arguing that I should be happy with a product that doesn't work for the purpose for which it is primarily advertised and for which I bought it. I don't have any use for it as an internet file server or to stream movies. To say I should be happy with it for those purposes makes no sense.
    Here's a little update on my experience with this thing. I made an appointment and visited the Genius Bar at the Apple Store yesterday. My appointment was at 10:40. I spoke with *** ****, who by the way is "Lead Genius" at the West Towne store in Madison, WI. I've got no problems with him. Like every other Apple employee I've dealt with locally, he was competent, courteous and helpful. He quickly fixed my Ethernet problem, but the more important problem, the failure to connect to my existing backups, left him as stumped as it did me. The problem apparently is that Time Machine uses the MAC address of your computer's logic board as part of its way of identifying your system. There is a file on the backup drive that stores this information; its filename consists of a period followed by the old MAC address. There is a procedure for renaming that file and changing some other settings which is supposed to make the backup work again. The procedure is detailed here:
    http://www.macosxhints.com/article.php?story=20080128003716101
    http://discussions.apple.com/thread.jspa?messageID=6893237
    The problem in my case is that the file was missing when I went looking for it. (I hadn't been mucking around on my Time Capsule, so I don't see any way that I could possibly have deleted it.)
    After some head-scratching, *** asked me to leave my computer and Time Capsule with him at the Apple Store and come back at 2:30 that afternoon. When I returned, not much progress had been made. He had found a way to make Time Machine do another initial backup from scratch within the same sparsebundle as my older backups and thought that once that backup completed, there would be a way to connect up with the older backups. By 2:30, however, the backup had only completed 4 GB out of the 130 gigs on my hard drive. The Apple Store closes at 9 p.m., so I would have had to leave it with them overnight just to get through that stage of the process. I think I mentioned previously that I live an hour's drive from the Apple Store, and since I had already burned another entire day trying to get this thing working, I had *** give it back to me so I could take it home and try to complete the procedure myself.
    Once I got it home, I let the backup run overnight. It took about 12 hours to complete. Now I'm going through the procedure *** gave me, which is documented in a little more detail here on Sean Kelly's blog:
    http://seankelly.tv/blog/blogentry.2008-08-25.8041499927
    I just finished spending an hour waiting for the computer to open my sparsebundle (step three of Sean Kelly's procedure). Only six more steps to go!
    <Edited by Moderator>

  • Calculation of SLA times based on Service Organization

    Is it possible to calculate the SLA times based only on the service org?
    a) Using service contracts, i.e. create a service contract with only the org and assign the service & response profiles.
    Or else as mentioned below.
    Please share your thoughts.
    I maintain the service & response profiles under "Maintain Availability and Response Times".
    Can I access these values directly in the BADI?
    My scenario is
    a) An agent belongs to a service org.
    b) I define these profiles separately for each org (Org1, Org2, etc.) in the above tcode. This is a manual entry; I know we don't have an org-to-profile mapping in that tcode, I just plainly maintain them.
    c) In the BADI I check the org entered in the complaint.
    d) For example, if Org1 is entered I want to access the profiles for Org1 (an if/else ladder).
    e) Then use these profiles to calculate the SLA times.
    f) Then save the document.
    g) Also trigger an e-mail stating the above timelines.
    Is the above flow possible??
    Let me know if you want me to post this onto another thread.
    Thanks
    amol

    Shalini,
    I will just be maintaining the service and response profiles in the "Maintain Availability and Response Times" tcode.
    There won't be an exact mapping stored in any table.
    My logic would be (I don't know whether this is right or wrong):
    1) Once I get the org, I would compare like this:
       IF orgdata = org1.
         " use service profile 1, etc.
       ENDIF.
    2) Then apply the profiles to the calculation of the SLA times.
    I think we can achieve what you said using CRM_ORDERADM_I_BADI,
    or we need to use the BADIs specifically mentioned for service contract determination and calculation of SLA.
    As you know, in SAP, for SLA times we need a service contract for a) the customer, b) the org, and many other parameters; to this service contract we then associate the service and response profiles. When the service contract is determined in a complaint, the service and response profiles are used to get the SLA times.
    But my requirement is to determine the SLA times based on the service organization, not based on the customer or any other parameters.
    For example: if my service org is in India the times would be different than if my service org is in the US.
    So let me know which approach would be best:
    using the BADIs as above, or defining different service contracts (without partner functions, customer, etc.) for the different orgs?
    Thanks
    Amol

  • Backing up an MBA with Time Machine?

    How does one back up a new MBA with Time Machine?
    I just bought a drive with the new Firewire 800 connections for my iMac.
    Do I have to buy another one with a USB connection, or can I do it wirelessly?

    Don't underestimate how amazing Time Machine is.
    After the initial backup, it will wirelessly back up your system every hour; it only takes a few minutes, and you are basically unaware of it even happening except for the sync icon turning in your menu bar.
    Accidentally erase an e-mail? Go back a day and voila, it's back.
    Throw out a bunch of photos from a disc and then empty the trash? Go back a few days and there they are.
    I find I use it far more than I ever thought I would; and ironically, because Macs basically just plain old work, I've never in 8 years had to go looking for a backup of files or anything that accidentally disappeared the way things do on Windows-based computers.

  • Problem with File Based replication in Weblogic Express 10

    Hi,
    We have a web application (exploded WAR) deployed on WebLogic Express 10, to a cluster of three managed servers (all three on different physical machines).
    We are using file-based session persistence in weblogic.xml.
    We have a shared location for all three servers where the session data is shared.
    When we start the application it works fine and is very fast, but after some time the application slows down.
    Troubleshooting the issue, we found that it is a problem with file-based replication. With file-based replication, every user session is stored as a directory inside the shared directory. So after a while thousands of directories are created inside the shared directory where the session information is stored. When we then access the application, it waits a long time with the message "Session Monitor ..." (because it is scanning the shared session storage directory, which contains very many directories), and only after a long time, like 10 minutes, do we get the application home page.
    When we clean up all the saved sessions inside the shared directory the application works fine, but we see the same thing again some time later, maybe after 3 or 4 hours, once the shared session directory again has a lot of session information stored in it.
    Is there a way to clean up the saved session information on the file system as soon as the user session is closed, while using file-based replication?
    We cannot use in-memory replication as our application doesn't support it.
    Please advise, as this is a major showstopper in our production mirror environment.
    Weblogic Consultant

    It is possible to reduce the number of live sessions by configuring a very low timeout-secs in weblogic.xml. The default is 60 minutes.
    More details are here:
    http://e-docs.bea.com/wls/docs100/webapp/weblogic_xml.html#wp1071982
              Jayesh
              Yagna Sys
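The timeout-secs setting mentioned above lives in the session-descriptor element of weblogic.xml. A sketch, assuming the WebLogic 10 descriptor schema (the 600-second value and the store directory are placeholder examples, not values from the thread):

```xml
<weblogic-web-app xmlns="http://www.bea.com/ns/weblogic/90">
  <session-descriptor>
    <!-- invalidate idle sessions after 10 minutes instead of the 60-minute default -->
    <timeout-secs>600</timeout-secs>
    <persistent-store-type>file</persistent-store-type>
    <persistent-store-dir>session_db</persistent-store-dir>
  </session-descriptor>
</weblogic-web-app>
```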
