Any APO Vehicle Scheduling DataSources in BI Content?

Hello all.
I need to get data from APO Vehicle Scheduling
into BW. I can't find any suitable content
objects in the BI Content help. Are there any?
If there are none, what views/tables should I use for
generic DataSources?
Thanks,
Timofei

Hema,
Usually the ASSERTION_FAILED dump is due to the PSA - delete the data in the PSA tables - ideally using SE14 to delete/drop the data in the tables, and then check again.
Warm regards,
Arun Varadarajan

Similar Messages

  • Vehicle Scheduling Error: Error when creating transport structures in liveCache

    Hi Gurus,
    We have one background job defined for vehicle scheduling, which shows as completed successfully, but the log says: Error when creating transport structures in liveCache and Not possible to schedule shipment in the planning horizon.
    My planning horizon is 90 days from the current date and the demand horizon is 30 days.
    I have checked the VS profile and I don't find any problem with it.
    Can you please help me find out what exactly went wrong?
    Thanks,
    Amol

    Try this:
    1. Open the Crystal client on your local machine and import that particular report.
    2. Do Database -> Verify Database. Then refresh the report on your PC and make sure that it returns data as expected.
    3. Export it back to the repository and try scheduling it again. You can check the "Database Configuration" property in the CMC for Crystal Reports to make sure it points to the right ODBC.
    Let us know.

  • Scheduler in servlet sending content twice

    First of all, this is what my servlet is doing.
    My servlet is used with a Quartz Scheduler to periodically send sms content to subscribers. I use,
    -MySQL = to store the content and subscribers' phone number
    -Quartz scheduler = to do scheduling for delivery of contents
    -Log4J = to log all delivery of contents
    The problem: My servlet sends the content twice to each subscriber. It is supposed to send it only once.
    Config:
    Schedule of delivery is every Wed and Thu
    Log file : cms.log
    MySQL URL connection: jdbc:mysql://178.547.22.345/best_cms?user=best_root&password=pass&autoReconnect=true&useUnicode=true&characterEncoding=ISO-8859-1
    So far I've done some investigation and it seems that:
    -When the scheduler is triggered, the app starts to send a content item to the listed subscribers. Say 1000 subscribers subscribe to a monophonic ringtone; then I set the scheduler to send one monophonic ringtone to those subscribers.
    -While the app is sending out the subscribed content, for every so-called transaction I log the phone number and the content that was sent. That means if I have 1000 subscribers, I would have 1000 log entries. I log all this in a file called cms.log, and I set the file size limit to 500kB. That means when the log file reaches 500kB, it renames the file to cms.log.1 and continues logging in a fresh new cms.log file.
    -OK, the problem comes when the app begins logging the delivery of the subscribed content. Let's say my logging stopped half-way through the cms.log file. So when the scheduler is triggered, my app starts to log the delivery of the subscribed content, right? And when it starts to log into the cms.log file, it logs each transaction twice. And the app actually did send the content twice to each subscriber, based on my DB log!
    -But the funny thing is this: when cms.log reached 500kB and a fresh cms.log file was started, the app went back to normal, meaning it only logged each transaction once, which also means the content was sent only once.
    -My "painstaking-to-solve" problem is: why does my app send the content twice before the logger rolls over to a new log file?
    I've triple-checked my SQL statements (even putting the word DISTINCT in), but to no avail. My troubleshooting ideas are:
    -Could it be a JDBC connection problem? Could I solve it by putting a delay between connecting to MySQL and querying?
    -Could it be a Log4J problem? If I stop logging the delivery of content, will that solve the problem?
    -Or could it be the scheduler problem?
    Please help.
    Here's the code
    Mel

    Well, yeah.
    I put myself on the subscriber list and I really did receive the content TWICE.
    Another thing I just found out: there's no problem with the logging. I got rid of the logging and my DB log still shows double transactions, row by row.
    My hoster asked me to update to the latest JConnector for MySQL and I did. But the problem still persists.
    Any other ideas on how to debug or fix?
    Mel.
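One failure mode consistent with these symptoms is the delivery job being registered twice (for example, a servlet re-init creating a second scheduler instance, or the same job/trigger being added to Quartz twice). A minimal sketch in plain Java - the class and method names are illustrative, not from this thread - of making the send idempotent per (subscriber, delivery date), so a duplicate trigger becomes a no-op:

```java
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical sketch: guard against a scheduler firing the same delivery
// twice. Keyed on (phone number, delivery date), so a second invocation
// for the same pair is skipped instead of sending the SMS again.
public class DeliveryGuard {
    private final Set<String> delivered = ConcurrentHashMap.newKeySet();
    private int sent = 0;

    // Returns true only the first time this subscriber/date pair is seen.
    public synchronized boolean send(String phone, String date) {
        if (!delivered.add(phone + "|" + date)) {
            return false; // duplicate trigger: skip the second send
        }
        sent++; // in the real app: push the SMS and write the cms.log entry
        return true;
    }

    public synchronized int sentCount() { return sent; }
}
```

This does not explain why the job fires twice, but it makes the symptom harmless while you hunt for the double registration.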

  • Vehicle scheduling

    Hi all,
    In vehicle scheduling (/sapapo/vs01) the shipments for a particular optimization profile are displayed. All the data about these shipments is stored in the table /sapapo/vs_lcmap (I guess so).
    But the activity start date/time (the day/time at which the shipment starts) in VS01 is different from the one in the /sapapo/vs_lcmap table.
    Can anyone explain why this is so? Or is there any other table that contains the data about shipments?
    If my question is not clear, please let me know.

    Hi Aarthi,
    table /SAPAPO/VS_LCMAP is the shipment master table and already contains some information, but information about the real shipment activities like loading, transportation and unloading is stored somewhere else.
    - If you use SCM 5.0 the shipment activities with all the start and end times (in UTC) are stored in DB-table /SAPAPO/VS_SHPA.
    - If you use a release < 5.0 the information is stored in LC at the relevant orders. You can use //OM16 to display the order and then branch to the activities.
    The value for activity start/end in LCMAP should be the start of the first activity of the shipment (normally the empty leg or first loading activity) and the end of the last shipment activity (normally the last unloading activity or empty leg), respectively.
    Hope that helps.
    Regards,
    Sebastian

  • Installation of Datasource from Business content

    Hi Friends,
    While installing a DataSource from Business Content in SAP R/3, I encountered the following error message. Can someone guide me on how to transfer the hierarchy with Business Content? I appreciate your help in resolving this problem.
    Application component FI-GL of DataSource 0FI_TX_4 does not exist
    Message no. R8418
    Diagnosis
    You tried to assign a DataSource to the application component FI-GL. This application component is, however, not entered in the RODSAPPL table as an application component.
    System Response
    The DataSource is saved, but is not visible in BW under the node NODESNOTCONNECTED.
    Procedure
    You do not have the option yet to transfer the application component hierarchy from the Content.
    You transfer the hierarchy with Business Content.
    Thanks & Best regards
    Pravin

    Sorry if a year has gone by.
    This is because you didn't have the Application Component Hierarchy installed in BW itself. Use RSA6 to do this, in the BW system (not the source system). Be careful if you already have something there, because it will override the hierarchy with the Business Content hierarchy.
    If you need to transport the active hierarchy, and not reinstall all the Business Content DataSources, the object is:
    R3TR DSAA APCO_MERGED
    So, create a workbench transport and add the above object.

  • HT201250 Is there any way to schedule backups on Time Machine?  I hate it that backup happens every dang hour.  I would like to have it back up at night.


    Richard Campbell2 wrote:
    I hear you, I do.  But the problem is that with my Mac, I have to stop and postpone whatever project I am working on while the backup occurs.  It just slows down my computer.
    Then something is wrong with your backups.  Changing the interval will only deal with the symptoms, not the actual problem, and you won't be as well protected as with hourly backups.
    If the backups are much larger than they ought to be,  see #D4 in Time Machine - Troubleshooting.
    If the sizes are reasonable, but it seems to take too long, see #D2 there.

  • Is there any standard program/tcode to download IDoc contents with fields?

    Hello All,
    Is there any standard program/tcode to download IDoc contents with fields?
    Thanks,
    Devaraj.

    Hi Patil,
    I don't think there is a standard program or t-code.
    But you can download the IDoc with field contents in the following way:
    Go to t-code WE62, enter the basic IDoc name and press the Display button; it then shows all field contents with descriptions.
    Now choose System -> List -> Save -> Local File from the menu bar.
    Here, select the file type (i.e. spreadsheet, HTML, etc.) and give the file name and the directory where you need to place it.
    Regards,
    Vijay.

  • APO/BW: Creating datasource in BW for APO

    Hi
    1. To create a DataSource (in BW) to be used in APO, I right-clicked on the cube in question and selected the option “Generate Export DataSource”. It created a DataSource 8CubeName, which
    could be found under the InfoSource tree in BW. At this point, will APO see 8CubeName?
    2. Where in  BW do we specify that it should go to APO? What if there are other systems which can also use this same datasource and how do we specify which one?
    3. I also understand that update rules can be created on APO side, if so, what becomes of the updated rule created on the BW end? In other words, if update rules are applied on the APO side, and also applied on the cube on BW side, what happens?
    4. Ok, so if I understand it right, this process creates a structure for datasource in BW, which is seen on APO DP. But which process actually pushes data from BW to APO DP?
    Thanks

    Hi Amanda,
    I believe most of your questions will be answered by understanding that inside APO/SCM itself there is a BW system that works just like the BW you already know; data staging between BW and APO is essentially the interaction between two BW systems (a data mart). Try RSA1 in your APO system.
    1. Yes, APO will see the DataSource after you replicate it in APO from BW as a source system; in APO we create the source system with RSA1 just like in BW and transfer the application component (RSA9). This DataSource will be displayed under 'Datamart'.
    2. It is just like one BW acting as a source system for another BW system; there is no need to specify in BW which one to go to, but in APO we will use this DataSource to assign to an InfoSource in APO.
    3. Again, APO is just like a 'mini' BW, so update rules in APO are used independently for APO-BW; nothing happens to the existing update rules in BW.
    4. Create an InfoSource in APO and assign the DataSource from BW, then create an InfoPackage.
    You can use process chains as well; in APO there are more process types (specific to APO usage, like generating CVCs, adjusting time series, etc.).
    http://help.sap.com/saphelp_scm50/helpdata/en/8a/9d6937089c2556e10000009b38f889/frameset.htm
    http://help.sap.com/saphelp_scm50/helpdata/en/13/5ada58309111d398250000e8a49608/frameset.htm
    hope this helps.

  • Is there any way to Schedule URL in IIS or Schedule Task

    Hi All,
    I really need an urgent help.
    We have more than 5 web applications running in IIS. Most of them are ASP.NET applications and one is a ColdFusion application.
    There is a facility in the ColdFusion Administrator to schedule URLs. We are using that facility for scheduling (15 URLs are scheduled).
    But now we need to get rid of ColdFusion.
    Is there any way to schedule the URLs in Windows Server?
    Appreciate your help.
    Thanks,
    Joji K John 

    I'd probably ask them over here.
    http://forums.iis.net/
    http://forums.asp.net/
    Regards, Dave Patrick ....
    Microsoft Certified Professional
    Microsoft MVP [Windows]
    Disclaimer: This posting is provided "AS IS" with no warranties or guarantees , and confers no rights.
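One common replacement for ColdFusion's scheduled-URL facility, sketched here as an assumption rather than a confirmed answer from the thread: have Windows Task Scheduler run a tiny HTTP client on the desired schedule. The URL and class names below are illustrative.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.time.Duration;
import java.util.List;

// Hypothetical sketch: a minimal "URL pinger" that a Windows scheduled
// task (schtasks.exe) can invoke, hitting each URL that ColdFusion used
// to schedule. Requires Java 11+ for java.net.http.
public class UrlPinger {
    // Build a GET request with a timeout for one URL.
    static HttpRequest buildRequest(String url) {
        return HttpRequest.newBuilder(URI.create(url))
                .timeout(Duration.ofSeconds(30))
                .GET()
                .build();
    }

    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        // URLs come from the command line; the fallback is illustrative.
        List<String> urls = args.length > 0
                ? List.of(args)
                : List.of("http://localhost/app/nightly-job.aspx");
        for (String url : urls) {
            HttpResponse<String> r =
                client.send(buildRequest(url), HttpResponse.BodyHandlers.ofString());
            System.out.println(url + " -> " + r.statusCode());
        }
    }
}
```

A scheduled task would then run something like `java UrlPinger http://server/app1/job http://server/app2/job` at the required times.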

  • Any recommendations or references for managing video content in iMovie 11?

    Any recommendations or references for managing video content (Events, Clips, Thumbnails, Projects, etc.) in iMovie 11?
    I just finished importing all my digital videos and understand the .mov files will be large, but want to make the most efficient use of my disk space and iMovie 11 as possible.  Any recommendations or references to other resources will be welcomed.

    My suggestion would be that in the future, they upload photos only to iPhoto, and upload video clips directly to iMovie. In the early days of iOS, going through iPhoto was the only way, so it is a habit that I have had to break.
    With iOS 5 and 6, iMovie now recognizes the iPhone and the iPad as cameras.
    I agree that the Consolidate Media command should bring over iPhoto movies along with the photos. I suggest that you go to iMovie/Provide Feedback and let the developers know that you would like this feature.

  • When I get messages from Barnes and Noble, the text is blank. If I hit reply or forward, I can then see the content. This only happens with Barnes and Noble. Any suggestions on how to view the content?


    I'm sorry, but your sister, unless she had already turned on the "Find my iPhone" feature and the person who took the phone has not disabled it, is out of luck. She should report the theft to local police authorities, including the serial number of her iPhone. While her experience is unfortunate, there are good reasons why Apple cannot do anything else about it.
    I hope she gets her phone back.
    Best of luck.

  • FI Spend Data for SRM, Has any one Generated the DataSource from SPL?

    Hello Friends: I am planning to activate the standard BI Content for Spend Analysis, specifically the InfoCube 0SR_FICO1 and the DSO 0SR_FIDS1. The online documentation recommends creating a Special Purpose Ledger in the source system and supplies a suggested extract structure.
    My question is: has anyone created a Special Purpose Ledger using this extract structure to generate the DataSource in the SAP source system?
    If you have, do you have any items or topics you encountered that you could share with me?
    Regards,
    Joe G.

    Hi,
    Please check this link; it may be helpful.
    [Spend Analysis|http://help.sap.com/saphelp_nw70/helpdata/en/41/61a76556455288e10000000a114e5d/frameset.htm]
    Regards,
    Madhu

  • Restrictions on APO Network scheduling?

    Hi,
    We are in the process of researching the restrictions of using APO to perform detailed scheduling on network orders.  The last document I found (SAP note 708517 way back from 2005) listed the following restrictions present in SCM 4.1  - I need to know if these still exist in more recent versions, like 7.0:
    -No transfer of relationships across networks. Relationships across networks are also used for subnetworks, so subnetworks will be treated as independent APO project orders.
    -No transfer of activity elements from R/3 to APO.
    -R/3 activity constraints (e.g. 'must start before') are not taken into account.
    -Manual requirement dates will not be fixed in APO, but will be transferred as offset times between activities and the manual component requirement date.
    Thank you very much for your help,
    Andrew
    Edited by: Andrew Schutsky on Dec 15, 2010 5:43 PM

    Scott has a lot of important things to consider listed. I agree with him that you must make sure you know what you are getting yourself into. My thoughts to add are:
    1. I would highly recommend some sort of bandwidth limiting. Unless you have an unlimited pipe to the Internet, someone will try to run bittorrent and chew up your bandwidth.
    2. I would definitely disallow peer-to-peer communications on your wireless to avoid security issues Scott mentioned. (if you use wireless for VoIP, this will be a problem. you cannot currently disable peer-to-peer on a per SSID basis).
    3. You need to have a good disclaimer page. If you have a lawyer, I would put the task to them to see what you have to include to protect yourself. If you are using LWAPP APs, you can use the built-in capabilities to present a splash page with your terms and an accept button. You can do this without having to do user authentication.
    4. I agree with Scott about not claiming to filter, but if there are any schools nearby, or even kids living nearby, you may be at risk if you do not filter. My customers go different ways on this one. Some filter and say they filter. Some do not say they filter, but do anyways. Some leave it wide open. This is a liability question you need to work out with someone who knows more about the law than I do.
    5. You absolutely should restrict available ports. I have set up public hotspots for many public sector organizations, particularly schools. We always restrict ports, usually just to http, https, and DNS, and sometimes VPN. Do not allow SMTP unless you want to be blacklisted. Spammers are starting to use open WiFi networks more and more to send untraceable spam.
    6. As Scott mentioned, make sure your ISP is ok with it. If you have business service, it should not be a problem. If you are using residential DSL or Cable modem, they almost all forbid sharing your connection.
    -Eric

  • Is there any way to schedule the webi report as body of the mail

    Hi All,
    I have a slightly different requirement. I need to schedule WebI reports to a BlackBerry mobile.
    The report should be displayed not as an attachment or URL but as the body of the mail.
    Is there any trick or tip to do that?
    Thanks in Advance

    You could schedule the reports to run as an Excel or PDF document and the destination as a folder on a server.  Then after the reports are sent to the destination, you could use the Program Job Server to run a script which parses the files line-by-line and creates an email.  It's not the most elegant solution but in theory it should work.
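The "parse the files line-by-line and create an email" step could be sketched as below. This assumes the report was scheduled to the destination folder as CSV, and it stops at building the HTML body; the actual send would go through a mail API (e.g. JavaMail), which is omitted. The file name and class names are illustrative.

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.stream.Collectors;

// Hypothetical sketch: turn a scheduled CSV export into an HTML table
// suitable for use as a text/html email body.
public class ReportMailBody {
    // Convert CSV lines into a simple HTML table, one row per line.
    static String toHtmlTable(List<String> csvLines) {
        String rows = csvLines.stream()
            .map(line -> "<tr><td>" + line.replace(",", "</td><td>") + "</td></tr>")
            .collect(Collectors.joining("\n"));
        return "<table border=\"1\">\n" + rows + "\n</table>";
    }

    public static void main(String[] args) throws Exception {
        // Read the file the scheduler dropped into the destination folder.
        List<String> lines = Files.readAllLines(Path.of("report.csv"));
        String body = toHtmlTable(lines);
        System.out.println(body); // hand this string to the mail API as text/html
    }
}
```

Note this naive split breaks on quoted fields containing commas; a real implementation would use a CSV parser.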

  • Dynamic datasource query in Content Presenter

    I have a Documents region with a custom Content Presenter template.
    I would like to use a dynamic datasource query with this, which makes use of the currently logged in user and the user's role.
    The task flow parameters look like this now:
    <parameters>
    <parameter id="taskFlowInstId"
    value="${'afd37bc3-bd2e-4e97-b838-74c975529633'}"/>
    <parameter id="datasourceType" value="${'dsTypeQueryExpression'}"/>
    <parameter id="datasource"
    value="${'SELECT * FROM ora:t:IDC:GlobalProfile WHERE ora:p:dDocType = \'Dagbericht\''}"/>
    <parameter id="templateCategory" value="${''}"/>
    <parameter id="templateView" value="${'dagberichten.list.template'}"/>
    <parameter id="maxResults" value="${'3'}"/>
    </parameters>
    What are my options to achieve this?

    Jaap,
    You can use expression language in that query to make it dynamic.
    You could either build the full query from a managed bean or just some parameters.
    Here's how you can do it when building the query from a managed bean.
    This should be the method in your bean:
    public String getQuery() {
        String user = ADFContext.getCurrent().getSecurityContext().getUserName();
        String query = "SELECT * FROM ora:t:IDC:GlobalProfile WHERE ora:p:dDocType = 'Dagbericht' and ora:p:dCreatedBy = '" + user + "'";
        return query;
    }
    Your datasource parameter should be something like this:
    <parameter id="datasource" value="#{yourBean.query}"/>
    Note: the "and ora:p:dCreatedBy..." part might not be correct, but it shows how you can dynamically build the query.
    Hope this helps.
