XQuery issue with large-volume data in OSB

Hi,
I am facing a problem with an XQuery transformation in OSB.
There is one XQuery transformation where I compare all the records and, if there are similar records, club them under the same first node.
The input file is read from the FTP process. This works perfectly for small input data. For large input data it also works, but it takes a huge amount of time, the file is moved to the error directory, and I see duplicate records created for the same input data. I see nothing in the error log or the normal log related to this file.
How can I check what exactly is causing the issue here: why the file is moved to the error directory, and why I get duplicate data for large input (approx. 1 GB)?
My XQuery is something like the one below.
<InputParameters>
{
    for $choice in $inputParameters1/choice
    let $withSamePrimaryID := $inputParameters1/choice[PRIMARYID eq $choice/PRIMARYID]
    let $withSameFirstName := $inputParameters1/choice[FIRSTNAME eq $choice/FIRSTNAME]
    return
        <choice>
        {
            if (data($withSamePrimaryID[1]/ClaimID) = data($withSameFirstName[1]/ClaimID)) then
                <ClaimID>{ data($withSamePrimaryID[1]/ClaimID) }</ClaimID>
            else
                <ClaimID>{ data($choice/ClaimID) }</ClaimID>
        }
        </choice>
}
</InputParameters>

Hi,
I understand your use case is:
a) read the file (from the FTP location; a .txt file, hopefully)
b) process the file (your XQuery; I will not get into the details)
c) do something with the file (send it to a backend system via a Business Service?)
You also note that files with large sizes take a long time to process. This depends on the memory/heap assigned to your JVM, so that much is expected behaviour.
As for the file being moved to the error directory: this could be your error handler doing its job (if you have one). If there are no error handlers, look at the timeout and error-condition scenarios on your service.
HTH
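
On the duplicate records: the FLWOR above returns one <choice> for every input row, so each group of matching rows is emitted once per member, and every row rescans the whole input, which would also explain the long runtime. Below is a minimal grouping sketch, assuming the element names from your snippet; OSB's XQuery 1.0 has no group by, so distinct-values stands in for it.

<InputParameters>
{
    (: visit each PRIMARYID once; $group holds all rows that share it :)
    for $id in distinct-values($inputParameters1/choice/PRIMARYID)
    let $group := $inputParameters1/choice[PRIMARYID eq $id]
    return
        <choice>
            <ClaimID>{ data($group[1]/ClaimID) }</ClaimID>
        </choice>
}
</InputParameters>

The remaining fields of $group would be clubbed inside the same <choice>; the point is that each group is emitted exactly once. For a ~1 GB file, the JVM heap (and OSB's streaming options for large content) still matters, as noted above.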

Similar Messages

  • Issue with Ord Start Date

    Hi,
    I am having an issue with the Ord Start Date.
    I create an XYZ asset and then do the acquisition for that new asset. Example: using FB01, I enter today's date and do the posting.
    Now when I go to the DEP Area tab for the DEP key (M200), the Ord Dep Start Date is 15 Dec 2010.
    For this DEP key's period control method (T-code AFAMP), the acquisition entered is 6 (at the start of the year) and the period control (T-code OAVS) is 4 (first-year convention at half-year start date).
    Our client is using a fiscal year of 1 June 2010 to 31 May 2011.
    So when I do the acquisition for the asset, per the period control the Ord Dep Start Date should be 1 Dec 2010, but here it is taking 15 Dec 2010.
    Why is the system taking 15 days more?
    Regards.

    Hi
    Which fiscal year variant are you using?
    Did you maintain it in OAVH? Copy the existing entry in OAVH for K4 to your fiscal year variant and try again.
    Also, 04 does not figure for acquisitions in AFAMP under 0004... whatever you specify in OAVH must also be a part of AFAMP.
    Regards
    Ajay M
    Edited by: Ajay Maheshwari on Oct 28, 2010 2:31 PM

  • Issue with list saving data after sites upgrade from sharepoint 2010 to sharepoint 2013

    Newform.aspx of list:-
    Newform.aspx of the list: sometimes the custom list does not save data in the new form after 15 minutes; only a blank record is created, even though some columns are mandatory fields.

    Hello dcakumar,
    Sounds like a strange issue. If you can reproduce it, do you see any errors in the ULS logs?
    - Dennis | Netherlands | Blog | Twitter

  • Oracle RDF / Joseki : issue with large literals

    Hi,
    I have been using Joseki to query an Oracle RDF model. There seems to be an issue with large literals (according to a few unreliable tests, I would say this concerns literals around and over 4000 chars).
    Here are the two potential behaviours:
    First case:
    If the result contains several lines, one of which contains an overly large literal, there is NO exception on the server side, but the resulting XML is incomplete.
    It is missing the "line" containing the large literal, and the XML stops there (which means that it is also missing the closing </results> and </sparql> tags). In my case, I am using the results through Jena's sparqlService, which means I get this message:
    XMLStreamException: Unexpected EOF; was expecting a close tag for element <results>
    at [row,col {unknown-source}]: [31,0]
    Second case:
    If the query only returns one line which contains an overly large literal, the client receives a simple "HttpException: 500 Internal Server Error".
    Here is the error message from my server:
    INFO [[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'] (SPARQL.java:165) - Throwable: weblogic.jdbc.wrapper.Clob_oracle_sql_CLOB cannot be cast to oracle.sql.CLOB
    java.lang.ClassCastException: weblogic.jdbc.wrapper.Clob_oracle_sql_CLOB cannot be cast to oracle.sql.CLOB
    at oracle.spatial.rdf.client.jena.OracleSemIterator.getNodesFromResultSet(OracleSemIterator.java:579)
    at oracle.spatial.rdf.client.jena.OracleSemIterator.next(OracleSemIterator.java:445)
    at oracle.spatial.rdf.client.jena.OracleLeanQueryIter.moveToNextBinding(OracleLeanQueryIter.java:135)
    at com.hp.hpl.jena.sparql.engine.iterator.QueryIteratorBase.nextBinding(QueryIteratorBase.java:98)
    at com.hp.hpl.jena.sparql.engine.iterator.QueryIterConvert.moveToNextBinding(QueryIterConvert.java:56)
    at com.hp.hpl.jena.sparql.engine.iterator.QueryIteratorBase.nextBinding(QueryIteratorBase.java:98)
    at com.hp.hpl.jena.sparql.engine.iterator.QueryIterRepeatApply.moveToNextBinding(QueryIterRepeatApply.java:76)
    at com.hp.hpl.jena.sparql.engine.iterator.QueryIteratorBase.nextBinding(QueryIteratorBase.java:98)
    at com.hp.hpl.jena.sparql.engine.iterator.QueryIterProcessBinding.hasNextBinding(QueryIterProcessBinding.java:54)
    at com.hp.hpl.jena.sparql.engine.iterator.QueryIteratorBase.hasNext(QueryIteratorBase.java:69)
    at com.hp.hpl.jena.sparql.engine.iterator.QueryIterConvert.hasNextBinding(QueryIterConvert.java:50)
    at com.hp.hpl.jena.sparql.engine.iterator.QueryIteratorBase.hasNext(QueryIteratorBase.java:69)
    at com.hp.hpl.jena.sparql.engine.iterator.QueryIteratorWrapper.hasNextBinding(QueryIteratorWrapper.java:30)
    at com.hp.hpl.jena.sparql.engine.iterator.QueryIteratorBase.hasNext(QueryIteratorBase.java:69)
    at com.hp.hpl.jena.sparql.engine.iterator.QueryIteratorWrapper.hasNextBinding(QueryIteratorWrapper.java:30)
    at com.hp.hpl.jena.sparql.engine.iterator.QueryIteratorBase.hasNext(QueryIteratorBase.java:69)
    at com.hp.hpl.jena.sparql.engine.ResultSetStream.hasNext(ResultSetStream.java:62)
    at org.joseki.processors.SPARQL.executeQuery(SPARQL.java:309)
    at org.joseki.processors.SPARQL.execQueryWorker(SPARQL.java:288)
    at org.joseki.processors.SPARQL.execQueryProtected(SPARQL.java:126)
    at org.joseki.processors.SPARQL.execOperation(SPARQL.java:120)
    at org.joseki.processors.ProcessorBase.exec(ProcessorBase.java:112)
    at org.joseki.ServiceRequest.exec(ServiceRequest.java:36)
    at org.joseki.Dispatcher.dispatch(Dispatcher.java:59)
    at org.joseki.http.Servlet.doCommon(Servlet.java:177)
    at org.joseki.http.Servlet.doGet(Servlet.java:138)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:707)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:820)
    at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:227)
    at weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:125)
    at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:292)
    at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:175)
    at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3594)
    at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
    at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:121)
    at weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2202)
    at weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2108)
    at weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1432)
    at weblogic.work.ExecuteThread.execute(ExecuteThread.java:201)
    at weblogic.work.ExecuteThread.run(ExecuteThread.java:173)
    Is there any fix or workaround?
    Please let me know if you need further information / tests.
    Thanks,
    Regards,
    Julien

    Thanks for your reply.
    While trying to build a small test case, I found out why there were discrepancies between the two cases I described.
    Indeed, usually the two cases return the same thing (no exception on the server side, but an incomplete resulting XML).
    The exception I described happened when I tried something else. Since I saw that the issues were coming from long literals, I used fn:string-length (ARQ) to figure out how long they were.
    The test case resulting in the CLOB-cast exception is:
    - a too-large literal
    - only one result "line" containing this literal
    - use of fn:string-length (which does not change the behaviour in the other cases, i.e. no long literals and/or several lines).
    Anyway, you will receive the other test cases shortly.
    Thanks,
    Regards,
    Julien

  • Hi All.  My iPad 4 just started having issues with the volume.  On either Bluetooth or the iPad Speaker, anything I'm watching or listening to drops the sound to just barely hearing it and then, about 2 seconds later, it comes back.

    Hi All,
    I tried to search through the community, but with the sheer volume of posts, I didn't find anything that seemed like the same issue.
    My iPad 4 just started having issues with the volume.  On either Bluetooth or the iPad Speaker, anything I'm watching or listening to drops the sound to just barely hearing it and then, about 2 seconds later, it comes back.
    I haven't tried wired ear phones, but I am sure they'll be the same way.
    It's very random, but consistent enough to happen several times per day.
    Thanks for any help that anyone can provide.
    I have restarted it and also done a hard reboot, and it didn't help.
    Thanks Again,
    Scott P.

    Well there you go, when you advertise it just be sure to say that you can't even see the scratches at half brightness and see how it goes.
    Of course, you are saying you want $500 for one with many scratches on the screen that has been dropped, where the scratches aren't visible at half brightness, and that has no warranty. Or I can buy a refurbished iPad 4 WiFi 32 GB directly from Apple with no scratches and a one-year warranty for $499. http://store.apple.com/us/product/FD514LL/A/refurbished-ipad-with-retina-display-wi-fi-32gb-white-4th-generation
    Why would I buy yours?
    I wouldn't pay more than $300 for a used model with your specs in perfect condition. But yours is far from perfect. I think your high point is $250 which is why I said 50%.

  • My iPod touch is having a little issue with the volume. When I plug in my headphones the volume works fine. When there aren't any headphones plugged in, no sound comes out from my iPod. When I go to turn up the volume, it still says headphones. Any tips?

    My iPod touch is having a little issue with the volume. When I plug in my headphones the volume works fine. When there aren't any headphones plugged in, no sound comes out from my iPod. When I go to turn up the volume, it still says headphones. I tried restarting it a few times, nothing worked. I looked up many tutorials, and still nothing did the trick. Any tips?

    - Try cleaning out/blowing out the headphone jack. Try inserting/removing the plug a dozen times or so.
    Try the following to rule out a software problem
    - Reset the iPod. Nothing will be lost
    Reset iPod touch: Hold down the On/Off button and the Home button at the same time for at least ten seconds, until the Apple logo appears.
    - Reset all settings
    Go to Settings > General > Reset and tap Reset All Settings.
    All your preferences and settings are reset. Information (such as contacts and calendars) and media (such as songs and videos) aren’t affected.
    - Restore from backup
    - Restore to factory settings/new iPod.
    - Make an appointment at the Genius Bar of an Apple store. Seems you have a bad headphone jack.
    Apple Retail Store - Genius Bar
    Apple will exchange your iPod for a refurbished one for this price. They do not fix yours.
    Apple - iPod Repair price                  
    A third-party place like the following will replace the jack for less. Google for more.
    iPhone Repair, Service & Parts: iPod Touch, iPad, MacBook Pro Screens
    Replace the jack yourself
    iPod Touch Repair – iFixit

  • Large Volume Data Merge Issues with InDesign CS5

    I recently started using InDesign CS5 to design a marketing mail piece which requires merging data from Microsoft Excel.  With small tasks of up to 100 pieces I do not have any issues.  However, I need to merge 2,000-5,000 pieces of data through InDesign on a daily basis, and my current laptop was not able to handle merging more than 250 pieces at a time; the process of merging 250 pieces takes up to 30-45 minutes if I get lucky and the software does not crash.
    To solve this issue, I purchased a desktop with a second-generation Core i7 processor and 8 GB of memory, thinking that this would solve my problem.  I tried to merge 1,000 pieces of data with this new computer, and I was forced to restart Adobe InDesign after 45 minutes of no results.  I then merged 500 pieces and the task was completed, but the process took a little less than 30 minutes.
    I need some help with this issue because I cannot seem to find other software that can design my mail piece the way InDesign can, yet the time it takes to merge large volumes of data is very frustrating, as the software does crash from time to time after waiting a good 30-45 minutes for completion.
    Any feedback is greatly appreciated.
    Thank you!

    Operating System is Windows 7
    I do not know what you mean by "patched to 7.0.4".
    I do not have a crash report; the software just freezes and I have to force close the program.
    Thank you for your time...

  • Dealing with large volumes of data

    Background:
    I recently "inherited" support for our company's "data mining" group, which amounts to a number of semi-technical people who have received introductory level training in writing SQL queries and been turned loose with SQL Server Management
    Studio to develop and run queries to "mine" several databases that have been created for their use.  The database design (if you can call it that) is absolutely horrible.  All of the data, which we receive at defined intervals from our
    clients, is typically dumped into a single table consisting of 200+ varchar(x) fields.  There are no indexes or primary keys on the tables in these databases, and the tables in each database contain several hundred million rows (for example one table
    contains 650 million rows of data and takes up a little over 1 TB of disk space, and we receive weekly feeds from our client which adds another 300,000 rows of data).
    Needless to say, query performance is terrible, since every query ends up being a table scan of 650 million rows of data.  I have been asked to "fix" the problems.
    My experience is primarily in applications development.  I know enough about SQL Server to perform some basic performance tuning and write reasonably efficient queries; however, I'm not accustomed to having to completely overhaul such a poor design
    with such a large volume of data.  We have already tried to add an identity column and set it up as a primary key, but the server ran out of disk space while trying to implement the change.
    I'm looking for any recommendations on how best to implement changes to the table(s) housing such a large volume of data.  In the short term, I'm going to need to be able to perform a certain amount of data analysis so I can determine the proper data
    types for fields (and whether any existing data would cause a problem when trying to convert the data to the new data type), so I'll need to know what can be done to make it possible to perform such analysis without the process consuming entire days to analyze
    the data in one or two fields.
    I'm looking for reference materials / information on how to deal with these issues, particularly when a large volume of data is involved.  I'm also looking for information on how to load large volumes of data into the database (current processing of a typical
    data file takes 10-12 hours to load 300,000 records).  Any guidance that can be provided is appreciated.  If more specific information is needed, I'll be happy to try to answer any questions you might have about my situation.

    I don't think you will find a single magic bullet to solve all the issues.  The main point is that there will be no shortcut for major schema and index changes.  You will need at least 120% free space to create a clustered index and facilitate
    major schema changes.
    I suggest an incremental approach to address your biggest pain points.  You mention it takes 10-12 hours to load 300,000 rows, which suggests there may be queries involved in the process which require full scans of the 650 million row table.  Perhaps
    some indexes targeted at improving that process would be a good first step.
    What SQL Server version and edition are you using?  You'll have more options with Enterprise (partitioning, row/page compression). 
    Regarding the data types, I would take a best guess at the proper types and run a query with TRY_CONVERT (assuming SQL 2012) to determine counts of rows that conform or not for each column.  Then create a new table (using SELECT INTO) that has strongly
    typed columns for those columns that are not problematic, plus the others that cannot easily be converted, and then drop the old table and rename the new one.  You can follow up later to address column data corrections and/or transformations. 
    Dan Guzman, SQL Server MVP, http://www.dbdelta.com
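
    A minimal sketch of the TRY_CONVERT check described above; the table and column names (dbo.RawFeed, Col17) and the target type (int) are placeholders, not from the original post.

        -- Count rows whose text value would not survive conversion to the guessed type.
        -- TRY_CONVERT returns NULL on failure, so genuine NULLs are excluded separately.
        SELECT
            COUNT(*) AS total_rows,
            SUM(CASE WHEN Col17 IS NOT NULL
                      AND TRY_CONVERT(int, Col17) IS NULL
                     THEN 1 ELSE 0 END) AS failed_conversions
        FROM dbo.RawFeed;

    If failed_conversions comes back as 0, the column should be safe to re-type in the SELECT INTO step described above.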

  • Flex file upload issue with large image files

    Hello, I have created a sample Flex application to upload an image, and also created a Java servlet to upload and save the image, deployed on a local Tomcat server. I am testing the application on a LAN. I am able to upload small as well as large image files (1 MB) from some PCs, but on some other PCs I am getting an IOError while uploading large image files, although it works fine for small images. The image upload hangs at 10%-20% and then throws an IOError. Surprisingly, it works OK on XP systems and causes issues on Windows 7 systems.
    Please give me any idea to get a solution.
    In Tomcat server side it is giving following error:
    request: org.apache.catalina.connector.RequestFacade@c19694
    org.apache.commons.fileupload.FileUploadBase$IOFileUploadException: Processing of multipart/form-data request failed. Stream ended unexpectedly
            at org.apache.commons.fileupload.FileUploadBase.parseRequest(FileUploadBase.java:371)
            at org.apache.commons.fileupload.servlet.ServletFileUpload.parseRequest(ServletFileUpload.java:126)
            at flex.servlets.UploadImage.doPost(UploadImage.java:47)
            at javax.servlet.http.HttpServlet.service(HttpServlet.java:637)
            at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
            at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
            at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
            at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
            at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
            at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
            at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102)
            at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
            at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:293)
            at org.apache.coyote.http11.Http11AprProcessor.process(Http11AprProcessor.java:877)
            at org.apache.coyote.http11.Http11AprProtocol$Http11ConnectionHandler.process(Http11AprProtocol.java:594)
            at org.apache.tomcat.util.net.AprEndpoint$Worker.run(AprEndpoint.java:1675)
            at java.lang.Thread.run(Thread.java:722)
    Caused by: org.apache.commons.fileupload.MultipartStream$MalformedStreamException: Stream ended unexpectedly
            at org.apache.commons.fileupload.MultipartStream$ItemInputStream.makeAvailable(MultipartStream.java:982)
            at org.apache.commons.fileupload.MultipartStream$ItemInputStream.read(MultipartStream.java:886)
            at java.io.InputStream.read(InputStream.java:101)
            at org.apache.commons.fileupload.util.Streams.copy(Streams.java:96)
            at org.apache.commons.fileupload.util.Streams.copy(Streams.java:66)
            at org.apache.commons.fileupload.FileUploadBase.parseRequest(FileUploadBase.java:366)
    UploadImage.java:
    package flex.servlets;

    import java.io.File;
    import java.io.IOException;
    import java.io.PrintWriter;
    import java.util.Iterator;
    import java.util.List;
    import java.util.Random;
    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    import org.apache.commons.fileupload.FileItem;
    import org.apache.commons.fileupload.FileItemFactory;
    import org.apache.commons.fileupload.FileUploadException;
    import org.apache.commons.fileupload.disk.DiskFileItemFactory;
    import org.apache.commons.fileupload.servlet.ServletFileUpload;

    public class UploadImage extends HttpServlet {

        /**
         * @see HttpServlet#doGet(HttpServletRequest request, HttpServletResponse response)
         */
        protected void doGet(HttpServletRequest request, HttpServletResponse response)
                throws ServletException, IOException {
            doPost(request, response);
        }

        public void doPost(HttpServletRequest request, HttpServletResponse response)
                throws ServletException, IOException {
            PrintWriter out = response.getWriter();
            boolean isMultipart = ServletFileUpload.isMultipartContent(request);
            System.out.println("request: " + request);
            if (!isMultipart) {
                System.out.println("File Not Uploaded");
            } else {
                FileItemFactory factory = new DiskFileItemFactory();
                ServletFileUpload upload = new ServletFileUpload(factory);
                List items;
                try {
                    items = upload.parseRequest(request);
                    System.out.println("items: " + items);
                } catch (FileUploadException e) {
                    e.printStackTrace();
                    return; // nothing to iterate over if parsing failed
                }
                Iterator itr = items.iterator();
                while (itr.hasNext()) {
                    FileItem item = (FileItem) itr.next();
                    if (item.isFormField()) {
                        String name = item.getFieldName();
                        System.out.println("name: " + name);
                        String value = item.getString();
                        System.out.println("value: " + value);
                    } else {
                        try {
                            String itemName = item.getName();
                            Random generator = new Random();
                            int r = Math.abs(generator.nextInt());
                            // "[.*]" is a character class: strip every '.' and '*' from the name.
                            String reg = "[.*]";
                            String replacingtext = "";
                            System.out.println("Text before replacing is:-" + itemName);
                            Pattern pattern = Pattern.compile(reg);
                            Matcher matcher = pattern.matcher(itemName);
                            StringBuffer buffer = new StringBuffer();
                            while (matcher.find()) {
                                matcher.appendReplacement(buffer, replacingtext);
                            }
                            matcher.appendTail(buffer);
                            int indexOf = itemName.indexOf(".");
                            String domainName = itemName.substring(indexOf);
                            System.out.println("domainName: " + domainName);
                            String finalimage = buffer.toString() + "_" + r + domainName;
                            System.out.println("Final Image===" + finalimage);
                            // Note: the upload is saved under a fixed name, not finalimage.
                            File savedFile = new File(getServletContext().getRealPath("assets/images/") + "/LowesFloorPlan.png");
                            //File savedFile = new File("D:/apache-tomcat-6.0.35/webapps/ROOT/example/" + "\\test.jpeg");
                            item.write(savedFile);
                            out.println("<html>");
                            out.println("<body>");
                            out.println("<table><tr><td>");
                            out.println("");
                            out.println("</td></tr></table>");
                            out.println("image inserted successfully");
                            out.println("</body>");
                            out.println("</html>");
                        } catch (Exception e) {
                            e.printStackTrace();
                        }
                    }
                }
            }
        }
    }

    This only occurs on Windows 7 systems, and the root of this problem is the SSL certificate.
    Workaround for this:
    Open the application in IE and click the certificate error link in the address bar. Click "Install certificate" and you are done.
    Happy programming.
    Thanks
    DevSachin

  • Having issues with music volume after syncing with iCloud

    Has anyone had issues with their music volume after syncing with iCloud ?

    There have been several posts here and on the iPhone site about this.
    The jury is still out on what the fix will be. Some posts say it's a problem with the metadata and the orientation data contained therein. I cannot confirm that.
    I do know that when I take a landscape screen snap, the orientation is wrong when I email or upload the photo to my computer.

  • Having issues with System Volume Information

    Hi All.
    I have a PS script that I use to wipe data from SQL servers as part of a decom process.
    The script is as follows.
    get-childitem F:\ -include *.ldf,*.mdf,*.ndf,*.bak,*.trc,*.txt,*error*,*sqlagent*,*system_heal*,*xel*,*bak*,*logs*,*.mdmp*,*cer* -exclude *System* -recurse | foreach ($_) {remove-item $_.fullname -Force }
    At the moment, it runs into issues with a folder. I would like either to delete the folder as well or to actually exclude it. Also, as an improvement to the script, I would want it to display the list of files it has deleted.
    The error that I get is as follows.
    remove-item : Access to the path 'F:\System Volume Information' is denied.
    At line:1 char:155
    + ...  foreach ($_) {remove-item $_.fullname -Force   }
    +                    ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
        + CategoryInfo          : PermissionDenied: (F:\:String) [Remove-Item], UnauthorizedAccessException
        + FullyQualifiedErrorId : RemoveItemUnauthorizedAccessError,Microsoft.PowerShell.Commands.RemoveItemCommand
    Confirm
    The item at F:\ has children and the Recurse parameter was not specified. If you continue, all children
    will be removed with the item. Are you sure you want to continue?
    [Y] Yes  [A] Yes to All  [N] No  [L] No to All  [S] Suspend  [?] Help (default is "Y"):
    Thanks in advance.
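
    A minimal sketch of one way to do both, assuming PowerShell 3.0 or later: filter the protected folder out of the pipeline before Remove-Item, and let -Verbose report each file as it is deleted. The extension list is abbreviated from the original script.
    # Skip 'System Volume Information' explicitly and log every deletion via -Verbose.
    Get-ChildItem F:\ -Include *.ldf,*.mdf,*.ndf,*.bak,*.trc,*.txt -Recurse |
        Where-Object { $_.FullName -notlike '*\System Volume Information\*' } |
        Remove-Item -Force -Verbose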

    Hi MrFlinstone,
    I’m writing just to check in and see if the suggestions were helpful. If you need further help, please feel free to reply to this post directly so we will be notified to follow it up.
    If you have any feedback on our support, please click here.
    Best Regards,
    Anna Wang
    TechNet Community Support

  • Issue with 0hrposition master data

    We are extracting data from SAP using the 0HRPOSITION_ATTR datasource. I noticed that data is not maintained correctly in the master data tables in BW, giving us incorrect results in reporting. Consider the scenario below:
    Position A is created as vacant on 04/01/2006, with start date (BEGDA / Valid from) 04/01/2006 and end date (ENDDA / Valid to) 12/31/9999. The entries below are shown under maintain master data for 0HRPOSITION in BW:
    Position | Valid To   | Valid From | Vacant
    A        | 03/31/2006 | 01/01/1000 |
    A        | 12/31/9999 | 04/01/2006 | X
    Position A is now delimited on 09/15/2006 as it is no longer required. In SAP, the position has a record only from 04/01/2006 to 09/15/2006, as vacant. When the record is extracted into BW, it creates the entries below in the master data table:
    Position | Valid To   | Valid From | Vacant
    A        | 03/31/2006 | 01/01/1000 |
    A        | 09/15/2006 | 04/01/2006 | X
    A        | 12/31/9999 | 09/16/2006 | X   <-- incorrect entry
    The entry 09/16-12/31 is incorrect, as the position does not exist for this duration. If we report on 0HRPOSITION with key date 09/30/2006, it shows position A as vacant even though the position no longer exists.
    Has anyone come across this situation? Any help is greatly appreciated.
    Kamal
    P.S.: Milind Rane... I was searching through the forums and came across your post. I would appreciate it if you could let me know how you solved this issue...
    Message was edited by:
            Kamal K

    Hi KK
    I have a similar issue. Can you please let me know how this issue with the 0HRPOSITION_ATTR extractor was resolved? In my case I have incorrect data in BW when reporting is done. Through the extractor up to the PSA the correct records are coming in, but in the master data of 0HRPOSITION I have incorrect records.
    Please help.
    Thanks
    Hari
    Message was edited by:
            SAPCOOL

  • Is there an issue with large libraries on iPod Classic 160 GB?

    I got a new iPod Classic a week ago. Have already taken it back and exchanged it once. The issue is that it works fine when I load , say, 20,000 songs onto it, but when I try to sync my whole library of around 31,500 songs it no longer works. All I get after the 'OK to disconnect' message has come up is the Apple logo. Attempting to reset doesn't work; reconnecting to the computer I get the message (some of the time) that iTunes has detected an iPod in recovery mode.
    I can restore the iPod by putting into disk mode, but restoring doesn't solve the problem - it comes back again if I try to sync the full library.
    So, I currently have a new iPod Classic with a big hard drive that is supposed to hold up to 40,000 songs according to specifications and what I was told by the expert in the Apple Store, but so far the behaviour of the two iPods has indicated that in reality it can't handle that many.
    This is not the first time I've seen this kind of issue with an iPod. I had exactly the same issue when the first 60 GB iPods came out - the software only worked with a smaller number of songs.

    I've been able to fill up a 160 GB Classic quite happily, although at some point I recall my iPod starting to crash as it filled up. One problem seemed to be that iTunes & Windows would get confused when trying to add a really large number of items, for which the following workaround helps during the rebuild.
    Break up large transfers:
    In iTunes select the menu item File > New Smart Playlist. Change the first drop-down box to Playlist, the next to "is", and the next to Music or whatever playlist holds the bulk of the content you want on your device. Tick "Limit to", type in say 10, then change the drop-down to GB, and set the last drop-down to "artist". When you click OK you can enter a name for the playlist, e.g. Transfer. Now sync this playlist to your iPod rather than your entire library. When the sync is complete, modify the rule (File > Edit Playlist) to increase the size by your chosen amount, then sync and repeat. You can experiment with different size increments; if it doesn't work, just choose something a bit smaller until it works each time. Before long you should have all your music on your iPod. Once that's done you can move on to other media such as podcasts, videos, photos, playlists etc.
    The other problem I found is that sometimes the iPod would crash on exiting or even on entry to an iPod game. After a number of restores I determined that the number and complexity of the playlists I was putting on was the cause of the problem. I believe these are loaded into working RAM, so use too much memory for these and the iPod struggles on any other memory hungry task.
    tt2

  • Issue with the Posting Date of the Purchase Order.

    Hi All,
    There are fields in BW like SSL1: Time OK, SSL2: Qty OK, SSL3: Time & Qty OK, SSL4: Days Late (routines are written to calculate them). These fields indicate whether the delivery against a GR is OK or not with respect to time, quantity, and the number of days late.
    But the issue I am facing is this:
    If there is only 1 delivery/GR against a single item, the calculations in BW are correct - i.e. for a particular PO, if there is only one delivery, fields like SSL1: Time OK and SSL2: Qty OK show that the delivery was done within the specified time and everything is OK (in case it was delivered within the allotted time).
    But if there are multiple deliveries or multiple GRs posted for one PO item, the calculations go wrong - i.e. even if the delivery is done well within the specified time, it shows the wrong result, as if it were delivered too late, because the earlier dates are overwritten.
    Can anyone throw some light on how I can go about solving this issue?
    I am thinking of declaring the Posting Date as a key field of the DSO (as of now it is a data field). I also want to know the impact of assigning it as a key field.
    Thanks in advance,
    Prasapbi

    Hi,
    As I understand it, you have a DSO based on Purchase Order, and your key fields are PO and its line item. The problem, as you stated, will always be there if multiple deliveries/GRs are created for a single line item, because the system will overwrite the entries for the same key.
    The problem with adding Posting Date as a key field is that your key then becomes PO - PO line item - Date. When the PO is created, the Posting Date will be blank (correct me here if I am wrong), so you will have two entries for the same PO - line item combination: one without a date and the other with a date, which again would be incorrect. Even if my assumption about the Posting Date is wrong, your data may still not be correct, because you may then have many entries with the same posting date, which again would overwrite each other.
    If there were any direct link between the PO line item and the number of deliveries that will be created for it, you could bring that field into the DSO as a key field. But I don't think there is any such field.
    Looking at your report requirement, I would suggest that you build a DSO based on goods receipts and then calculate these key figures by comparing the dates between the GR posting date and the PO line item date.
    Else you can change the way your datasource works (if it is a generic one based on a function module). Since your main requirement is to check whether the GR posting date has met your SLA or not, you should fetch all the details only when the GR is created and make your key field PO - PO line item - GR

  • FR Layout issue with large number of columns

    Hi!
    I'm developing a report in FR 11.1.1.3 with over 30 columns.
    The issue is that when I run the report in web preview, the dimension drop-down in the page area goes to the far right and disappears from the display.
    If I reduce the number of columns, I don't have this problem.
    I've already tried maximizing the workspace, without any result.
    Can anyone help me deal with reports with large numbers of columns?
    Regards,
    Luís
    Edited by: luisguimaraes on 13-Mar-2012 06:48

    IE8 could be the reason. According to the supported platform matrices (http://www.oracle.com/technetwork/middleware/bi-foundation/oracle-hyperion-epm-system-certific-2-128342.xls), tab "EPM System Basic Platform", row 70, FR and Workspace should be patched in order for IE8 to work.
    FR Patch number: 9657652
    Workspace Patch number: 9314073
    Patches can be found on My Oracle Support. Just search for the patch number.
    Cheers,
    Mehmet
