Servlet Chaining and OAS 4.0.8.1
I am using the RequestDispatcher concept in my application, where a servlet either initialises a bean or calls another servlet. I believe I cannot test this from JDeveloper, but does OAS 4.0.8.1 support servlet chaining? This is very critical. If yes, please let me know where I can find the relevant documentation.
Thanks
More info ...
I enabled logging and found the following error:
Unable to initiate threads: cannot find class java/lang/Thread.
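The RequestDispatcher pattern described above — one servlet deciding whether to initialise a bean itself or forward the request to another servlet — can be sketched outside any container. The following is a plain-Java simulation only; the `MiniServlet`/`Dispatcher` names are invented for illustration, and in a real container you would instead call `request.getRequestDispatcher(path).forward(request, response)`:

```java
import java.util.HashMap;
import java.util.Map;

// A "servlet" that either handles the request or forwards it on.
interface MiniServlet {
    void service(Map<String, String> request, StringBuilder response, Dispatcher dispatcher);
}

// Stand-in for the container's RequestDispatcher machinery.
class Dispatcher {
    private final Map<String, MiniServlet> servlets = new HashMap<>();

    void register(String path, MiniServlet s) { servlets.put(path, s); }

    // Analogous to request.getRequestDispatcher(path).forward(req, resp)
    void forward(String path, Map<String, String> request, StringBuilder response) {
        servlets.get(path).service(request, response, this);
    }
}

public class ChainDemo {
    public static void main(String[] args) {
        Dispatcher d = new Dispatcher();
        // The entry servlet decides, then forwards without writing to the client.
        d.register("/entry", (req, resp, disp) -> {
            if ("bean".equals(req.get("mode"))) {
                resp.append("bean initialised");    // initiate a bean instead
            } else {
                disp.forward("/worker", req, resp); // chain to another servlet
            }
        });
        d.register("/worker", (req, resp, disp) -> resp.append("worker handled request"));

        Map<String, String> req = new HashMap<>();
        req.put("mode", "servlet");
        StringBuilder resp = new StringBuilder();
        d.forward("/entry", req, resp);
        System.out.println(resp); // prints "worker handled request"
    }
}
```

The key property, as in the real API, is that the entry point makes its decision before anything is written back to the client.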
Similar Messages
-
Process Chain and e-mail configuration
We have configured a process chain e-mail message recipient by keying in the full recipient e-mail address in the message screen. This works successfully.
However, we would prefer to use the SAP user ID in the process chain message configuration. (The SAP user ID would contain the external e-mail ID.)
Why: Because we do not want to have to reconfigure the process chain every time a user or an e-mail changes. We would like to use a generic user in the process chain configuration, and then only change the generic user e-mail when changes occur.
When trying to use the SAP user ID in the process chain, no mail message is received.
We would like to seek input from folks in this forum who may have worked with a similar scenario and have developed a solution, or could provide suggestions on how to make this work.
Thanks in Advance
Barbara,
Mine all work fine on WLS or JServe; maybe you should include a code snippet?
Barbara Singer wrote:
> Hello:
>
> We have some simple servlets that need to process forms and bundle this
> information up in a SMTP message. This is very basic stuff but we can not
> seem to get this to work with WebLogic AppServer 4.51. I've built the
> sample servlet /examples/servlets/MailServlet and this does not work either.
> It also throws no exception. I can see it connect to the SMTP server and
> send the information, but nothing comes through.
>
> I also read in your documentation that /utils/sendMail will be a deprecated
> method in a future release. Is it still supported in 4.51?
>
> Any input, sample examples working, etc. is welcome.
>
> /bas
Russell Castagnaro
Chief Mentor
SyncTank Solutions
http://www.synctank.com
Earth is the cradle of mankind; one does not remain in the cradle forever
-Tsiolkovsky
-
Create Process Chains and add Process Types using ABAP
Does anyone here have experience in creating or changing a full process chain, including the process types?
The reason is that we have a lot of source systems with "similar" loads.
We already have an ABAP that can copy the chain and replace the InfoPackages, but we can't find a way to create/change a process, e.g. 'Hierarchy Save', 'PSA Delete', etc.
Any ABAP samples are highly appreciated!
Best Regards, Jakob
Hello Jakob,
have you already found this how-to paper: "How to ... Implement custom process types"? (https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/72e0e1ed-0c01-0010-74bc-b397c8c4dadc)
It has a code sample in the appendix.
Hope it helps,
regards
Martin
Message was edited by: Martin Lehmann -
How to run servlet in and out JDev.
Hi,
I have developed a simple servlet through the wizards. Since I'm new to servlets in JDeveloper, I'm having problems running the servlet both inside and outside JDeveloper.
Even when I run it as per the help documentation,
I get this error:
java.io.IOException: CreateProcess: cmd.exe /C start "" "C:\JDEVELOPER3.0\public_html\WebAppRunner.html" error=0
I would be grateful if anyone could assist me.
I am also not able to take a simple tutorial application from JDeveloper 3.0 and deploy it on Oracle App Server 4.0.8.1.
I ran the tutorial which creates a Jservlet accessing one database table. It runs fine on Jdev 3.0.
I used the deployment wizard to create a JAR file. I sent the JAR file to the OAS
and updated the OAS classpath to point to the JAR file. Invoking the URL
host:/virtual/mypkg.myclass gets the error message
that an application error has occurred.
A snip from WRB.LOG
`JAVAWEB` 772 0 0x400fff `Exception happend in executing racle.OAS.servlet.servletrunner.main(String[])
Any detailed steps on how to do this are appreciated.
Thanks
Mike -
Can WebLogic do Servlet chaining?
Is there any way to do servlet chaining, or more specifically, for a servlet
to process the output of another servlet efficiently in WebLogic?
Sure a servlet can make another HTTP request on its own servlet engine but
that seems way too inefficient and heavy handed.
J.
James Strachan
=============
email: [email protected]
web: http://www.metastuff.com
Check out XSPs from cocoon. xml.apache.org
-- bk
James Strachan wrote:
> JSP includes go straight to the HTTP response - there's no way of
> redirecting them to some internal buffer for post processing. (*)
>
> I'd like to be able to do simple Servlet chaining at the Servlet or JSP
> level. i.e. I want to post process the output of one servlet / JSP page to
> do things like caching or XSL styling. This is a totally reasonable request
> IMHO. Imagine a complex portal with a lot of included JSP files - I'd like to
> be able to cache whole chunks of the page - a chunk may have many includes
> inside it..
>
> Right now there is no way of doing such a thing in WebLogic AFAIK. You have
> to go through every JSP file and add caching / styling to it rather than
> being able to 'pipeline' or 'servlet chain' which is less than ideal. (Also
> remember there is a 64K code size limit on the bytecode that can exist in a
> Java class - so JSP files should be kept small to avoid hitting this
> barrier). I can't quite believe no one else has hit this problem.
>
> There are workarounds such as doing the include using separate internal HTTP
> requests, RMI calls or JMS messages, all of which seem to be far too
> heavyweight.
>
> (*)
> <aside>
> One side effect of the JSP include always going straight to the response
> is that you can't use WebLogic's <cache> tag if you are using any kind of
> JSP include. e.g. the following snippet doesn't work as expected :-
>
> <cache>
> something
> <jsp:include file="foo.jsp" flush="true"/>
> something else
> </cache>
>
> since you are not allowed to do an include inside a body tag. Even if you
> were, the output of the include would go straight to the response, not the
> body tag.
>
> So you have to close and reopen the cache tags around each include, which may
> break your XML compliance for complex pages and is error-prone and much more
> inefficient to boot:-
>
> <cache>
> something
> </cache>
> <jsp:include file="foo.jsp" flush="true"/>
> <cache>
> something else
> </cache>
>
> </aside>
>
> --
> J.
>
> James Strachan
> =============
> email: [email protected]
> web: http://www.metastuff.com
>
> "Jaggu Dada" <[email protected]> wrote in message
> news:[email protected]...
> >
> > Hi --
> >
> > Why not use JSP includes? The only servlet "chaining" that is
> > reasonable
> > is to use a servlet to process an initial request, and make some
> > decision
> > (based on the querystr, for example) then, without having written
> > anything
> > back to the client, do a server side redirect (with RequestDispatcher)
> > to
> > a servlet or JSP that does some work. If I understand you correctly,
> > servlets
> > were not designed to do what you are proposing.
> >
> > Hope this helps,
> >
> > -jaggu
> >
> > James Strachan wrote:
> > >
> > > Hi Joe
> > >
> > > Thanks for that. Sure, that would work too, though it's probably heavier
> > > weight than just plain old HTTP.
> > >
> > > I was looking for something a little more lightweight such that I could
> > > include 10-20 servlet chains per web page on a complex portal without
> too
> > > much performance hit. i.e. using synchronous servlet 'pipelining' to
> > > generate complex pages without having to do many internal HTTP / JMS
> > > requests.
> > >
> > > --
> > > J.
> > >
> > > James Strachan
> > > =============
> > > email: [email protected]
> > > web: http://www.metastuff.com
> > >
> > > "Joe Trung" <[email protected]> wrote in message
> > > news:[email protected]...
> > > >
> > > > Hi Jim,
> > > > I chain my servlets via jms: the queue is output/input bin
> > > >
> > > > Joe
> > > >
> > > >
> > > > "James Strachan" <[email protected]> wrote:
> > > > >Is there any way to do servlet chaining, or more specifically, for a
> > > servlet
> > > > >to process the output of another servlet efficiently in WebLogic?
> > > > >
> > > > >Sure a servlet can make another HTTP request on its own servlet
> engine
> > > but
> > > > >that seems way too inefficient and heavy handed.
> > > > >
> > > > >J.
> > > > >
> > > > >James Strachan
> > > > >=============
> > > > >email: [email protected]
> > > > >web: http://www.metastuff.com
> > > > >
> > > > >
> > > > >
> > > >
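The buffering "pipeline" James is asking for in this thread can be sketched in plain Java: run one stage into a private buffer, post-process the buffered output, then emit it. This is only an illustration of the idea under assumed names (`PipelineDemo`, `postProcess` are invented); inside a real servlet engine the equivalent trick is an `HttpServletResponseWrapper` whose writer targets an internal buffer instead of the client:

```java
import java.util.function.Consumer;
import java.util.function.UnaryOperator;

public class PipelineDemo {
    // Run one stage into a private buffer, then post-process the buffered
    // output (caching or XSL styling would go where `transform` is).
    static String postProcess(Consumer<StringBuilder> stage, UnaryOperator<String> transform) {
        StringBuilder buffer = new StringBuilder();
        stage.accept(buffer);                      // first "servlet" writes here,
        return transform.apply(buffer.toString()); // not straight to the client
    }

    public static void main(String[] args) {
        String page = postProcess(
            buf -> buf.append("<p>hello from stage one</p>"),
            String::toUpperCase); // stand-in for a real transformation
        System.out.println(page); // prints "<P>HELLO FROM STAGE ONE</P>"
    }
}
```

The point of contention in the thread is exactly this redirection: JSP includes write straight to the response, so there is no buffer available to transform.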
-
Process chains and event collectors
Hi All,
I need help with process chains and event collectors. I have joined a new project where the client uses process chains and event collectors, and they have asked me to work in these areas. I haven't worked on this before, so please send any docs on the topic and explain the procedure and the technology behind this concept. I would really appreciate it if someone could send me full documentation, as I couldn't find any.
Thanks,
Ras
Hi Ras,
Process chains are a sequence of processes to be performed. They are put together in a chain with the necessary dependencies (process A needs to finish before B can start) and conditions (if A and B are successful then C, else send an email), and then scheduled. They usually revolve around processes related to data loading: load, activate, roll up, compress, etc.
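The success/failure wiring just described (run a step; if it succeeds run the next, otherwise notify and stop) can be sketched as ordinary code. This is only an illustration of the control flow, not of any BW API; the class and step names are invented:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.BooleanSupplier;

public class ProcessChainSketch {
    // One step in the chain: a name plus the work it performs.
    record Step(String name, BooleanSupplier work) {}

    // Run steps in order; on the first failure, notify and skip the rest.
    static List<String> run(List<Step> chain) {
        List<String> log = new ArrayList<>();
        for (Step step : chain) {
            if (step.work().getAsBoolean()) {
                log.add(step.name() + ": ok");
            } else {
                log.add(step.name() + ": failed, notifying admin"); // e.g. send the email
                break; // dependent steps never start
            }
        }
        return log;
    }

    public static void main(String[] args) {
        List<String> log = run(List.of(
            new Step("Load", () -> true),
            new Step("Activate", () -> false),
            new Step("Roll up", () -> true))); // skipped because Activate failed
        log.forEach(System.out::println);
    }
}
```

Real process chains add branching on success/failure and parallel paths, but the dependency idea is the same.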
Please take a look at these links/threads for more info:
http://help.sap.com/saphelp_nw04/helpdata/en/8f/c08b3baaa59649e10000000a11402f/content.htm
https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/8da0cd90-0201-0010-2d9a-abab69f10045
https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/19683495-0501-0010-4381-b31db6ece1e9
https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/36693695-0501-0010-698a-a015c6aac9e1
https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/9936e790-0201-0010-f185-89d0377639db
https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/3507aa90-0201-0010-6891-d7df8c4722f7
https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/263de690-0201-0010-bc9f-b65b3e7ba11c
http://help.sap.com/saphelp_nw2004s/helpdata/en/8f/c08b3baaa59649e10000000a11402f/frameset.htm
There are a lot of threads available on SDN. Go through them; hopefully they really help you understand what process chains are and the events included in them.
Assign points if it helps you.
Regards,
Sreedhar -
Want to run servlets, jsp and ejb
Hi there, I have just installed Oracle9iAS on my PC and want to check how to use OAS with servlets, JSPs, and EJBs. Can anyone tell me where I can find some worked examples and a step-by-step deployment process? Any help would be appreciated. Regards, Shabbier
If you have installed Oracle9iAS, you should be able to get to the welcome page at
http://<machine_name>:7778/
You can see J2EE demos off that page.
Also, you can find info about oracle9iAS and documentation
at http://otn.oracle.com/products/ias/content.html
OC4J info can be found at
http://otn.oracle.com/tech/java/oc4j
-Prasad -
Problem with process chain and request
Hi experts, I need help!!
Description of the problem:
There are three ODSs that load information daily through a process chain into an InfoCube. The information is loaded into BW from a view generated on an Oracle database.
The process chain for the three ODSs is identical; the procedure is:
1. The chain starts.
2. It locks the view.
3. Data is loaded through the InfoPackage.
4. If the load finishes well, the data is deleted from the view, activated in the ODS, and rolled up to the InfoCube; if it finishes badly, the view is unlocked and the process chain ends.
5. Then the indexes are rebuilt.
What happens is that in all three ODSs, when we load the information, the previously loaded requests are erased, so the load history does not stay in the ODSs. The three InfoPackages used to load the information into the ODSs do not have the delete-data-target option marked, and neither do the ODSs.
When I do the loads manually this problem does not happen. I deleted the process chain and recreated it, but that did not solve the problem.
Thank you
I just checked the documentation and found that your code is incorrect. IAlternativeName::StrValue contains the value for an email address, a Domain Name System (DNS) name, a URL, a registered object identifier (OID), or a user principal name (UPN). It doesn't contain a string value for a directory name (or the other unmentioned types). Instead, you need to instantiate an IX500DistinguishedName interface and initialize it from the alternative name value:
class Program {
    static void Main(string[] args) {
        String RequestString = "Base64-encoded request";
        CX509CertificateRequestPkcs10 request = new CX509CertificateRequestPkcs10();
        request.InitializeDecode(RequestString, EncodingType.XCN_CRYPT_STRING_BASE64_ANY);
        Console.WriteLine("Subject: {0}", request.Subject.Name);
        foreach (IX509Extension ext in request.X509Extensions) {
            if (ext.ObjectId.Name == CERTENROLL_OBJECTID.XCN_OID_SUBJECT_ALT_NAME2) {
                CX509ExtensionAlternativeNames extensionAlternativeNames = new CX509ExtensionAlternativeNames();
                string rawData = ext.RawData[EncodingType.XCN_CRYPT_STRING_BASE64];
                extensionAlternativeNames.InitializeDecode(EncodingType.XCN_CRYPT_STRING_BASE64, rawData);
                foreach (CAlternativeName alternativeName in extensionAlternativeNames.AlternativeNames) {
                    switch (alternativeName.Type) {
                        case AlternativeNameType.XCN_CERT_ALT_NAME_DIRECTORY_NAME:
                            // Directory names must be decoded via IX500DistinguishedName
                            IX500DistinguishedName DN = new CX500DistinguishedName();
                            DN.Decode(alternativeName.RawData[EncodingType.XCN_CRYPT_STRING_BASE64]);
                            Console.WriteLine("SAN: {0}", DN.Name);
                            break;
                        default:
                            Console.WriteLine("SAN: {0}", alternativeName.strValue);
                            break;
                    }
                }
            }
        }
    }
}
My weblog: en-us.sysadmins.lv
PowerShell PKI Module: pspki.codeplex.com
PowerShell Cmdlet Help Editor pscmdlethelpeditor.codeplex.com
Check out new: SSL Certificate Verifier
Check out new: PowerShell FCIV tool. -
Hello all,
We are currently having a problem with submitting process chains. When the chain runs, it does not display any of the steps in the process chain or where they are running from the job we submit (RSI_START_BW_CHAIN). So what we see is that the PC has been submitted and completes in, say, 17 seconds; however, everything is still running on the system. For example, our CIF PC runs for 3 hours, but all we are able to see is that the job executed from CPS with no problem. I've been digging around trying to find clear documentation about PC and CPS, with little luck. Am I approaching the job submission incorrectly? We are on 7.0.3.
Hello,
You should be using the RSI_RUN_BW_CHAIN job to start process chains. Maybe you can try that first. For the rest, things should be straightforward. Depending on the BW backend systems you might encounter some issues; with BW 7 the synchronization has changed, and you would be better off using one of the latest 7.0.4 versions (SP6 has just been released).
Regards Gerben -
Hi Gurus
We are designing process chain for our BW solution.
We have identified the dependencies of the various loads, which include flat-file loads and loads from R/3.
Now we would like to control ERP and BW jobs by Control M .
I have the following questions:
1. Which portion of the BW process chain do we need to plug in to Control-M? Do we need to connect the meta chain to Control-M, or the individual process chains?
2. When we create process chains for BW, what is the best practice? Is it good to create small chains and connect them via a meta chain, or should we create only small chains?
3. Can we include R/3 extraction jobs and V3 jobs in our BW process chain, so that they are triggered when we run the process chain?
4. If we decide to run V3 jobs every half an hour on the R/3 side, how often do we have to extract data to BW, and how do we catch the correct delta without missing a single record?
I would appreciate if you can help me with your knowledge.
Thanks
Kris
Hi Kris,
Control M is a third party scheduling tool provided by BMC software.
1) You have the option to include either a meta chain or individual chains in Control-M.
2) It depends on how your data is extracted. If you have any dependency jobs on the source system, then it is better to create small process chains and include them along with your dependency jobs in Control-M.
3) You can include R/3 dependency jobs and V3 jobs in Control-M. Once those jobs are finished, you can run the BW process chains via Control-M.
4) You have several scheduling options available in Control-M to take care of your BW and R/3 jobs.
V3 jobs collect data from the R/3 application tables and fill the R/3 delta queue (RSA7). When you run the BW process chain, it will collect all the data available from the R/3 delta queue.
If both jobs run at the same time, then it is better to apply some wait time to the BW jobs (in the process chains).
hope this helps -
Process Chain and Info Package Transportation?
Hello All
Case 1
How do I transport a process chain and an InfoPackage individually (<b>for the first time from DEV to QA</b>)?
Case 2
I have 9 InfoPackages (<b>IP1 to IP9</b>) and 3 process chains (<b>P1 to P3</b>), and each process chain contains 3 InfoPackages, e.g. IP1, IP2, IP3 in P1, and so on.
Now, is it necessary to transport the InfoPackages and process chains individually, or is it enough to transport the process chains only (if we transport the process chains, will the InfoPackages be transported as well)? And how do we assign a request to each one?
What I mean is, all the <b>InfoPackages are in the $TMP package</b> and the process chains are also in <b>$TMP</b>. Is it necessary to change the package for the InfoPackages as well as the process chains and transport only the PCs? I am confused here; can anyone help me out?
Many thanks
balaji
Hi Balaji,
If you have created new InfoPackages then you have to transport them. It is not necessary to transport the InfoPackages if you have them in the process chain but have not changed them.
Also, you cannot transport any BW object in the $TMP package. Assign a package and then transport the objects.
The best way to transport the process chain is from the RSPC screen. Click the transport button there and collect all the objects and transport them.
Bye
Dinesh -
I always get
Status: Failed - HOST [Macintosh.local] QuickTime file not found.
after the first part of the job is successful.
If I just submit with "This Computer" it works fine. The original file is ProRes 422; the first job uses ProRes 422 to scale it down to 480x270, and the second job compresses to h.264. I found some info on this board from 2008 saying that job chaining and QuickClusters don't work together. Is that still how it is? That's really useless..
I also found this from Jan 2009
David M Brewer said:
The reason the second rendering is failing is.......this has happened to me a few times until I figured it out.....make sure you set the dimensions of the video in the h.264 settings; set them to the same size as the ProRes dimensions.
For the most part the dimensions are left blank for the second link, h.264. And don't use 100% of source. Put the physical numbers into the spaces. When you link one video to another, the second codec doesn't know the settings you made for the first video.
Also make sure you (at least check) set the audio for the second video. I usually have the ProRes do the audio conversion and just pass it through to the second video settings. Again, it can happen that the audio is disabled in the h.264 settings. This has happened a few times for me........... Check and double-check your settings!
He doesn't mention anything about with or without QuickClusters, but I tried what he said and could not get it to work with QuickClusters...
Anyone got any new info on this?Studio X,
Thanks for taking the time to run some tests and post your results.
I'm finding the same results with converting ProRes422 to mp4, But...
Other codecs are giving me very different results.
I've run some random tests to try to get a grip on what's happening.
First I was playing around with the number of instances. I've read here and on Barefeats that (at least for my model Mac Pro) the instances should be set to (# of processors / 2), so I've been using 4 for quite a while now and thought I'd test it for myself.
A single 5min ProRes422 1920x1080 29.97 file to h.264
This Computer- 15:28
2 Instances- 14:56
3 Instances- 13:52
4 Instances- 14:48
5 Instances- 13:43
6 Instances- 13:48
7 Instances- 13:58
In this case 5i was the fastest but not using a Quickcluster wasn't far off
A single 2m30s ProRes422 1920x1080 29.97 file to h.264
This Computer- 3:19
2 Instances- 3:45
3 Instances- 3:45
4 Instances- 3:45
5 Instances- 3:50
6 Instances- 4:00
7 Instances- 4:00
Interesting...not using a Quickcluster is fastest
A single 2m30s ProRes422 1920x1080 29.97 file Scaled down using original codec
This Computer- 5:20
4 Instances- 4:10
5 Instances- 4:10
7 Instances- 4:11
A single 1m30s ProRes422 1920x1080 29.97 file to mpeg-2
This Computer- 2:12
5 Instances- 2:10
When QuickClusters are faster, 4-5 instances does seem to be the sweet spot (again, for my setup).
In the mpeg-2 test, I should have used a longer clip to get a better result, but it was getting late and I was just trying to get an idea of the codec's usage of my resources. I was also monitoring CPU usage with Activity Monitor in all tests.
Now multiclip batches:
I forgot to write down the length of the clips in this first test but it consisted of 8 ProRes 422 clips. 3 about 1m long and the rest between 13s and 30s
8 ProRes 422 clips to mp4
This Computer- 11:25
4 Instances- 5:16
Same results as Studio X
Next tests with 5 clips(total 1m51s)
5 ProRes 422 clips to h.264
This Computer- 5:00
4 Instances- 4:52
5 ProRes 422 clips to mpeg-2
This Computer- 2:55
4 Instances- 3:01
5 ProRes 422 clips to DV NTSC
This Computer- 6:40
4 Instances- 5:12
5 ProRes 422 clips to Photo Jpeg
This Computer- 2:44
4 Instances- 2:46
I re-ran the last test with 7 clips because of the time it took to reassemble the segmented clips.
7 ProRes 422 clips to Photo Jpeg(total 3m14s)
This Computer- 4:43
4 Instances- 3:41
One last test,
A single ProRes 422 clip to Photo Jpeg(4:05;23)
This Computer- 5:52
4 Instances- 4:10
Let me start off by saying it is clear that there are many factors that affect compression times, such as the number of clips, length of clips, and codecs, but here are some of the things I noted:
1) Some codecs themselves seem to be "more aware" of the computer's resources than others.
When I compress to h.264 w/o a cluster it will use about 80-85% of all resources
When I compress to h.264 with a cluster it will use about 90-95% of all resources
When I compress to PhotoJpeg w/o a cluster it will use about 20-25% of all resources
When I compress to PhotoJpeg with a cluster it will use about 80-85% of all resources
2) The time it takes to reassemble clips can be quite long and could affect overall speed.
In the very last test, compressing a single file to Photo JPEG using 4 instances took 4m10s. Watching Batch Monitor, I noted that it took 2m0s to compress and 2m10s to reassemble. Wow...
It would be interesting to see how the disassembly/reassembly of bigger batches using clusters affects overall time. But that would take some time.
I think the thing I will be taking with me from all of this is that your workflow is your own. If you want to optimize it, you should inspect it, test it, and adjust it where it needs adjusting. Now if anyone has the time and were to run similar tests with very different results, I'd love to know about it... -
Hi, recently my MacBook Pro has stopped connecting to our office Synology server automatically. I now have to Go > Connect to Server > etc. in order to browse the file server. Clearing the keychain and then re-adding everything hasn't made any difference. HELP
Not sure what other tests they could run for me. I've pretty much run all the tests I can using Drive Genius and Tech Tool Pro 5. Is there anything they use that I don't know about??
-
To find only master data process chains and their InfoPackages
Hi,
I have a task to remove the scheduler setting that loads data directly into data targets instead of into the PSA and then into data targets, across the whole system.
I should not touch the transactional InfoPackage settings, though.
These InfoPackages must exist in process chains.
How do I go about this?
Hi,
If I have understood your point, you want to change the settings of the InfoPackages included in the process chains pertaining to master data.
In that case you can change the settings directly in the processing tab.
Or, if you don't want to change the settings of the IPs that are in the process chain, go to RSA1 >> select those IPs >> right-click and copy them. Then you can change the settings there.
You can run each IP manually. Or, if you want the same chain just with the changed settings, copy the chain, put the new IP in place of the old one, and schedule the chain.
If your doubts are cleared then kindly assign me some points.
Regards,
Debjani.. -
Chain and store data not populating properly in BPARTNER
Hi all,
There is a field 0BPARTNER2 in the datasource 0BP_RELATIONS_ATTR. This field is also in DSO1, which fetches the data from the mentioned datasource. Then, using routines, this field 0BPARTNER2 is split into two fields, chain and store, which get populated in the next-level DSO, DSO2, and finally go into the 0BPARTNER InfoObject.
But for two particular BP numbers, the values of chain and store are not getting populated.
I checked in RSA3 and the data is OK, but at the PSA level the data is not there.
This is happening only for these two particular BP numbers.
Kindly help.
Hi,
If the InfoPackage data selection has no selection fields, please check the options below:
1. Go to ECC -> enter transaction code RSA5 -> select the data source -> go to change mode -> check the selection possibilities.
2. If it is not there, come to the BI system, select the data source, go to change mode -> choose the selection menu -> browse and pick 'X - selection possible'.
Selections will now be available in the InfoPackage; while checking the RSA3 data, apply the same selection in the InfoPackage.
Thanks,
Phani.