BPEL in PeopleSoft Data Conversion Process

I am new to BPEL, and my requirement is to use BPEL to extract data from PeopleSoft to flat files and load it into a third-party external system. The same requirement used to be accomplished by SQR...
Can someone advise how BPEL fits into this conversion process? I am reading the documentation but am still not very clear. If there are any online resources, that would help too.
Thanks in advance!
Max

1. What are the basic steps I need to manually configure in the new box before I do LSMW? --- Depends on how the system is set up; normally you define number ranges manually.
2. What do I need to do with number ranges so that assets will have the same asset numbers? --- Define your number ranges to generate numbers externally (not internally), and you can load the "old" SAP asset number into the new box as it is.
3. What are the steps to bring over asset balances? --- Read up on LSMW and/or search this forum; you will find lots of threads discussing the same.
4. What are the steps to continue depreciation in the new system from the remaining useful life? (For example, say I have an asset worth 1000 dollars with a useful life of 5 years; 2.5 years have been depreciated and the current balance is 500. Now I am moving that asset to the new instance, and it should continue depreciating from where it left off, month 31 of 60.)
Once you load the APC, accumulated depreciation, and the original useful life of the old asset, SAP will automatically start depreciating from the current month onwards based on the remaining useful life/months.
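To make the arithmetic concrete, here is a small worked sketch in plain Java, assuming straight-line depreciation as in the example above (the class name is just for illustration):

    public class RemainingDepreciation {
        public static void main(String[] args) {
            double apc = 1000.0;        // acquisition cost loaded from the old system
            double accumDepr = 500.0;   // accumulated depreciation already posted
            int lifeMonths = 5 * 12;    // original useful life: 5 years
            int usedMonths = 30;        // 2.5 years already depreciated
            double monthly = (apc - accumDepr) / (lifeMonths - usedMonths);
            System.out.printf("Remaining monthly depreciation: %.2f%n", monthly); // 16.67
        }
    }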
Please always search on this forum before you ask.

Similar Messages

  • GL Legacy Data Conversion

    I have a question about the data conversion strategy.
    We are planning to store 2 years of detailed transactions and 4 years of balances in the Oracle system. For the data conversion process, we were going to take the same approach. It would, however, cause schedule and workload constraints. Actually, we have not yet discussed this from a realistic viewpoint. In other words, we do not insist on storing all of the data, i.e. the 2 years of transactions and 4 years of balances.
    Although I believe that we will be able to reduce the volume of data to be converted from the old chart of accounts to the new one, I am a little bit concerned about a couple of things, as follows:
    1. The workload for reporting processes
    Assuming that we will not use the Oracle standard reports much, it would not be a big issue even if we do not convert all of the data I mentioned above. We will store historical report data somewhere and be able to generate certain reports using both Oracle data and the historical data in the repository. If not, it could cost us extra effort.
    2. Audit trails / examination trails
    If we give up converting the data to fit the new system, and that fiscal year has not been examined yet, how should we handle the non-converted data for the examination? I am just wondering whether we only have to prepare a conversion table that ties Oracle balances back to ABC detailed transactions for their reference.
    If anyone is aware of anything you can advise me on, could you please provide some information or guidance? Thanks to ALL.

    Hi,
    When you talk about GL data, it is the trial balance to be loaded from legacy into Oracle Applications.
    You can use Web-ADI to upload it. The checklist is:
    The balance for each account combination in the legacy system is to be mapped to a GL Oracle code-combination balance.
    With respect to open AR and AP invoices: if the invoices are converted using a control account for migration into GL, then the balances transferred from AP and AR need not be reversed in GL from the source receivables and payables.
    In case the same account combinations are used for migrating the balances from AP and AR to GL, then the balances transferred from AP and AR should be reversed, so that they do not affect the GL trial balance.
    Finally, ensure that the trial balance tallies with your legacy system, then upload it using Web-ADI, import, review, and post it.
    Hope these points help your GL data conversion.
    Regards,
    Ramaa
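    If it helps, here is a minimal pre-upload check sketched in Java under assumed names (the mapping table, account strings, and amounts are made up): it verifies that every legacy account combination is mapped to an Oracle code combination and that the mapped trial balance still nets to zero before the sheet goes to Web-ADI.

        import java.util.HashMap;
        import java.util.Map;

        public class TrialBalanceCheck {
            public static void main(String[] args) {
                // legacy account combination -> Oracle GL code combination (illustrative)
                Map<String, String> legacyToOracle = new HashMap<>();
                legacyToOracle.put("1000-CASH", "01-000-1110-0000");
                legacyToOracle.put("4000-SALES", "01-000-4100-0000");

                // legacy trial balance (debits positive, credits negative)
                Map<String, Double> legacyBalances = new HashMap<>();
                legacyBalances.put("1000-CASH", 2500.0);
                legacyBalances.put("4000-SALES", -2500.0);

                double net = 0.0;
                for (Map.Entry<String, Double> e : legacyBalances.entrySet()) {
                    if (!legacyToOracle.containsKey(e.getKey())) {
                        throw new IllegalStateException("Unmapped legacy account: " + e.getKey());
                    }
                    net += e.getValue();
                }
                if (Math.abs(net) > 0.005) {
                    throw new IllegalStateException("Trial balance does not net to zero: " + net);
                }
                System.out.println("All accounts mapped and the TB balances; ready for Web-ADI.");
            }
        }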

  • Service contract data conversion

    Hi,
    Where can I get documentation for the service contracts data conversion process? I mean interface tables or APIs, etc.
    Thanks
    ram

    Can you please specify where I can find a stepwise explanation for Service Contracts using APIs? Which Oracle API reference guide are you referring to here?
    rgds

  • Data Conversion rules for EDI processing (same client IDOC processing)

    Hi,
    I am trying to post IDocs in the same client. It's a PO -> SO process,
    i.e. there will be one outbound and one inbound IDoc in the same client using EDI processing.
    I am using Data Conversion using Rules for converting the sender fields.
    The LIFNR and PAORG of segment E1EDKA1 have to be converted.
    For ALE processing, the data conversion is done correctly.
    But no conversion is done for EDI.
    Can anybody help me with this problem?
    Thanks in advance.
    Regards
    Megha

    Issue solved

  • Is there a way to stop and then resume an ipod video conversion process?

    I am trying to convert videos to play on my new iPod Classic. The conversion process is painfully slow, and I need to move the computer. Is there a way to stop the conversion process and then resume it later where you left off? The only way I have figured out is to put the laptop to sleep and then open it up later to resume. This is a problem too, as I have an external USB drive hanging off the side. Any ideas?
    Thanks all in advance for your help.

    AFAIK you can just uninstall Thunderbird and reinstall. The profile will not be deleted. If you want to make sure, however, make a backup of your profile folder first.
    Read up on "backup", e.g. here: http://email.about.com/od/mozillathunderbirdtips/qt/et_backup_prof.htm
    (google is your friend :9 )
    P.S. As a matter of fact, you should regularly back up your data (e.g. your Thunderbird profile) to some place different, so as not to lose it all if a Thunderbird bug/virus/Windows failure/Layer-8 problem corrupts your data.

  • Data conversion strategy for new SOB

    Dear Viewers,
    on 11.5.10
    We are creating a new SOB with a change in currency from Feb-11, as this is the requirement.
    For the same, we need to do data conversion.
    I have some confusion regarding Purchase Orders and Sales Orders.
    Purchase Orders:
    Open purchase orders will be converted, meaning the unfulfilled POs, i.e. the ones not received and fully open.
    For the POs which have been received but not delivered, we have requested the users to clear the in-transit receipts.
    The POs which are partially received: what is to be done for them?
    If a PO is fully received and delivered, it will not be converted to the new SOB, as it is not an open PO;
    but if an invoice comes after Feb-11, then how will the matching be done?
    And what if a return has to be made moving forward in Feb-11 under the new SOB?
    Sales Orders:
    Open sales orders will be converted, that is, the ones that have been entered and not yet booked.
    Users have been requested to clear off the sales order lines which are already pick-confirmed but not yet shipped, so they will be shipped and interfaced to AR.
    For the sales orders that have been booked, those lines that are not yet processed further will also be converted.
    Now what if a receipt comes after Feb-11? How do we handle this, as the sales order would not have been converted?
    Please give your advice on the data migration strategy for POs and SOs.
    Please do add any point that I may have missed.
    Appreciate your help.
    Thanks
    Emm

    Hi David,
    for master data conversion you can use LSMW and the RE-FX BAPIs (please refer to SAP note [782947|https://service.sap.com/sap/support/notes/782947]).
    Regards, Franz

  • Data conversion for new sob

    (The question body is identical to the "Data conversion strategy for new SOB" thread above.)

    emm wrote:
    The POs which are partially received: what is to be done for them? If a PO is fully received and delivered, it will not be converted to the new SOB, as it is not an open PO; but if an invoice comes after Feb-11, then how will the matching be done?
    The business needs to decide whether they are fine with matching those POs manually (i.e. referring to the documents and verifying); in that case you may capture the PO information in a DFF on the invoice distribution. Otherwise, if the POs under this scenario have to be converted, you may consider converting them with the receipt close tolerance set to 100% and the match approval level set to 2-way (again, business approval is needed to handle audit issues) in order to avoid converting the receipts/deliveries, etc.
    emm wrote:
    What if a return has to be made moving forward in Feb-11 under the new SOB?
    Ideally, returns can be done using Miscellaneous/Account Alias Issues, specifying appropriate transaction reasons to clarify the scenario.
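    For what it's worth, the options discussed in this thread can be summarised as a small decision table; here is an illustrative sketch in Java (the enum values and rule texts are mine, not an Oracle API):

        public class PoConversionRule {
            enum PoStatus {
                FULLY_OPEN,
                RECEIVED_NOT_DELIVERED,
                FULLY_RECEIVED_INVOICE_PENDING,
                RETURN_AFTER_CUTOVER
            }

            static String decide(PoStatus status) {
                switch (status) {
                    case FULLY_OPEN:
                        return "Convert to the new SOB as an open PO";
                    case RECEIVED_NOT_DELIVERED:
                        return "Ask users to clear the in-transit receipts before cutover";
                    case FULLY_RECEIVED_INVOICE_PENDING:
                        return "Match manually (capture the PO in a DFF), or convert with "
                                + "receipt close tolerance 100% and 2-way matching";
                    case RETURN_AFTER_CUTOVER:
                        return "Use a Miscellaneous/Account Alias Issue with a transaction reason";
                    default:
                        throw new IllegalArgumentException("Unknown status: " + status);
                }
            }

            public static void main(String[] args) {
                for (PoStatus s : PoStatus.values()) {
                    System.out.println(s + " -> " + decide(s));
                }
            }
        }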

  • Data conversion for New GL - Going live during fiscal year

    Hi Experts,
    My client is going live on Oct 1 (fiscal year is Jan-Dec). So for data conversion we need to load the P&L and balance sheet as of Sept 30, 2009. I need your help to resolve a few open issues. (We have activated document splitting with zero balance and segment.)
    1) When we load P&L balances, how can I assign profit centers to all these P&L items?
    2) What is the procedure (best practice) to assign profit centers to all the balance sheet items, since we are going live during the fiscal year?
    3) When we process open items (AR, AP) in the month of October, how will the profit center assignment work?
    (My client will be using profit/cost center terminology for the first time, so we need to map all the B/S and P&L items to profit centers.)
    4) What is the best way to map all the balance sheet GL accounts to profit centers?
    Sorry for asking so many questions. Any help would be very much appreciated.
    Thanks,
    Sam

    1) When we load P&L balances, how can I assign profit centers to all these P&L items?
    Each P&L account balance has to be uploaded with a cost center; the profit center will then be derived automatically.
    2) What is the procedure (best practice) to assign profit centers to all the balance sheet items, since we are going live during the fiscal year?
    Let the client decide the profit center for the balance sheet items, since they know which balance relates to which profit center.
    3) When we process open items (AR, AP) in the month of October, how will the profit center assignment work?
    While uploading the AR and AP balances, each line item is to be uploaded with a profit center; the PC assignment will then work automatically in October.
    (My client will be using profit/cost center terminology for the first time, so we need to map all the B/S and P&L items to profit centers.)
    4) What is the best way to map all the balance sheet GL accounts to profit centers?
    Try to identify each balance sheet GL balance with its own profit center; this will help in future reports. Otherwise, put them in a common PC.
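    A minimal sketch of the fallback rule in point 4, in Java (the account numbers and profit center names are made up):

        import java.util.Map;

        public class ProfitCenterMapper {
            private static final String COMMON_PC = "PC_COMMON"; // hypothetical common profit center

            static String profitCenterFor(String glAccount, Map<String, String> mapping) {
                // use the client-supplied PC where one was identified, else the common PC
                return mapping.getOrDefault(glAccount, COMMON_PC);
            }

            public static void main(String[] args) {
                Map<String, String> mapping = Map.of("113100", "PC_PLANT_A");
                System.out.println(profitCenterFor("113100", mapping)); // PC_PLANT_A
                System.out.println(profitCenterFor("160000", mapping)); // PC_COMMON
            }
        }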

  • Data Conversion Errors for the last week

    We've been running a simple Stream Analytics job for a little over a month now with a very light workload. The input is an Event Hub and the output is SQL Server. We noticed today that we haven't received anything in SQL Server since 2014-12-08 (we don't receive events every day, so we only know that everything still worked on the 8th of December), so we checked the job's logs. It seems the job is failing to process all the messages: the value of "Data Conversion Errors" is high.
    I wonder what could have happened? We haven't touched the client since we started the job, so it's still sending the messages in the same format. And we haven't touched the job's query either.
    Has there been an update to either Stream Analytics or Event Hubs which could cause the issue we're seeing?

    I've followed the TollApp instructions word for word (except the thing with NamespaceType "Messaging" that has been added to New-AzureSBNamespace).
    I have 0 lines in the output, and this is the service log:
    Correlation ID: e94f5b9e-d755-4160-b49e-c8225ceced0c
    Error:
    Message: After deserialization, 0 rows have been found. Possible reasons could be a missing header or malformed CSV input.
    Message Time: 2015-01-21 10:35:15Z
    Microsoft.Resources/EventNameV2: sharedNode92F920DE-290E-4B4C-861A-F85A4EC01D82.entrystream_0_c76f7247_25b7_4ca6_a3b6_c7bf192ba44a#0.output
    Microsoft.Resources/Operation: Information
    Microsoft.Resources/ResourceUri: /subscriptions/eb880f80-0028-49db-b956-464f8439270f/resourceGroups/StreamAnalytics-Default-West-Europe/providers/Microsoft.StreamAnalytics/streamingjobs/TollData
    Type: CsvParserError
    Then I stopped the job, connected to the event hub with a console app, and received this:
    Message received. Partition: '11', Data: 'TollId,EntryTime,LicensePlate,State,Make,Model,VehicleType,VehicleWeight,Toll,Tag
    85,21/01/2015 10:24:56,QBQ 1188,OR,Toyota,4x4,1,0,4,361203677
    Message received. Partition: '11', Data: 'TollId,EntryTime,LicensePlate,State,Make,Model,VehicleType,VehicleWeight,Toll,Tag
    33,21/01/2015 10:25:42,BSE 3166,PA,Toyota,Rav4,1,0,6,603558073
    Message received. Partition: '11', Data: 'TollId,EntryTime,LiMessage received. Partition: '10', Data: 'TollId,EntryTime,LicensePlate,State,Make,Model,VehicleType,VehicleWeight,Toll,Tag
    59,21/01/2015 10:23:59,AXD 1469,CA,Toyota,Camry,1,0,6,150568526
    Message received. Partition: '10', Data: 'TollId,EntryTime,LicensePlate,State,Make,Model,VehicleType,VehicleWeight,Toll,Tag
    25,21/01/2015 10:24:17,OLW 6671,NJ,Honda,Civic,1,0,5,729503344
    Message received. Partition: '10', Data: 'TollId,EntryTime,LicensePlate,State,Make,Model,VehicleType,VehicleWeight,Toll,Tag
    51,21/01/2015 10:24:23,LTV 6699,CA,Honda,CRV,1,0,5,169341662
    Note the bug in the 3rd message. In my opinion it's unrelated; it could be the WriteLine that can't keep up with the stream in the console application. At worst it's in the stream itself, but then I should see at least some lines in the output for the correctly formatted messages.
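    As a quick sanity check outside Stream Analytics: a message of the shape shown above (a header line plus one data line) should deserialize to exactly one row, while a message carrying only the header would yield 0 rows, which is what the CsvParserError in the log complains about. A throwaway Java check (the sample payload is copied from the first message above):

        public class TollCsvCheck {
            public static void main(String[] args) {
                String message = "TollId,EntryTime,LicensePlate,State,Make,Model,"
                        + "VehicleType,VehicleWeight,Toll,Tag\n"
                        + "85,21/01/2015 10:24:56,QBQ 1188,OR,Toyota,4x4,1,0,4,361203677";
                String[] lines = message.split("\n");
                String[] header = lines[0].split(",");
                System.out.println(header.length + " columns, " + (lines.length - 1) + " data row(s)");
                // expected output: 10 columns, 1 data row(s)
            }
        }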

  • Data Conversion Design Patterns

    I'm looking at building a conversion program that will import data from several different formats and convert it into one common format. The converter should simply be pointed at a database or a flat file, and it will extract the data and populate tables in the target database. All the logic is mind-numbingly simple, but as far as an overall design goes, what are your thoughts?
    For example, should I build a separate tool that validates the source data? Or should the validator be part of the converter?
    Are there design patterns out there that anyone can recommend?

    For this problem you can go for the Strategy pattern. A strategy is nothing but an algorithm to reach the solution.
    The Strategy pattern deals with different algorithms (strategies) that achieve the same result. The client will have some indicator; using that indicator, the strategy manager decides which algorithm to invoke.
    In your case, the algorithm may differ depending on the data format. For example, if you have five different formats of data, then you may have to write five different pieces of logic to achieve the solution.
    So first of all you need to identify the strategies you are going to use.
    You will then need the following classes:
    1) SourceReader --> responsible for getting the data from either a file or a database. It just reads and holds the data. It identifies which strategy to use by parsing the data; no data processing happens here.
    2) StrategyManager --> reads the data from the source reader, then decides on and instantiates the strategy to use.
    3) Strategy --> could be an interface or an abstract class. It holds any operation that is common irrespective of the data format.
    4) ConcreteStrategy --> the implementation class; the actual data extraction happens here.
    5) DataLoader --> loads the data into the destination database.
    6) YourEntity --> the data to be populated can be kept in object form. If your records contain many attributes, you can hold the data as objects. (This class is required only if all the data is related and falls into the same logical group.)
    For example:
        import java.util.List;

        // Reads raw records from a file or database; no data processing happens here.
        class SourceReader {
            private List<String> data;

            public void run() {
                readData();
                StrategyManager.getInstance().process(data, findStrategy());
            }

            private void readData() { /* read from file or DB and populate 'data' */ }

            private int findStrategy() { /* logic for identifying the data format */ return 1; }
        }

        // Singleton that picks the strategy matching a format indicator.
        class StrategyManager {
            private static final StrategyManager sm = new StrategyManager();

            public static StrategyManager getInstance() { return sm; }

            public void process(List<String> data, int indicator) {
                // read the indicator and instantiate the appropriate strategy class
                Strategy s = (indicator == 1) ? new Format1Strategy() : new Format2Strategy();
                s.process(data);
            }
        }

        // Operations common to every format; process() fixes the overall order.
        abstract class Strategy {
            public abstract void parse(List<String> data);

            // if the validation logic is common, it can live here
            public void validate(List<String> data) { }

            // load the converted data into the destination database
            public void updateDB() { }

            public final void process(List<String> data) {
                validate(data);
                parse(data);
                updateDB();
            }
        }

        // One concrete strategy per input format; the actual extraction happens here.
        class Format1Strategy extends Strategy {
            public void parse(List<String> data) { /* format 1 extraction */ }
        }
        class Format2Strategy extends Strategy {
            public void parse(List<String> data) { /* format 2 extraction */ }
        }
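    One note on the design sketched above: process() in the abstract class acts as a template method, fixing the validate-parse-load order while each concrete strategy supplies only the format-specific parse(). If you want the validator to double as a standalone tool (your first question), you can lift the validation logic into its own class and have both the converter and the tool call it.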

  • Data conversion of open checks

    Hello,
    We are preparing our data conversions for a new implementation of SAP in the US. A question came up as to whether we should enter check masters for checks we have printed and mailed to our vendors but that haven't been cashed at the time of go-live. The other option would be to put the balance of these open checks into a GL account and manually enter journal entries when they are cashed.
    Our normal business process will be to use FCHR to process a file we receive from our bank with the cashed checks. If we don't create check masters for our open checks, then FCHR will not be able to automatically process checks cashed before go-live.
    My question: is it best practice to load these open checks during the data conversion into SAP so that FCHR will clear them, or is this normally handled as an exception using manual journal entries?
    And if it is best practice to load the check masters into SAP, what transaction code would we use to enter the open checks?
    Thanks,
    JB

    Hi
    We have developed an ABAP program to set the deletion indicator on POs en masse. You can do the same by consulting your ABAPer.
    Regards
    Anil

  • Data Load Rule file - Date conversion

    Hi, while working on a data load rule file I ran into this problem: I'm getting dates in the format "m(m)/d(d)/yyyy hh:mm:ss". Is there a way to change this to "mm/yy"? (There wouldn't be any problem if I got the mm/dd/yyyy hh:mm:ss style, but unfortunately I don't.)

    Can you run the file through a "conversion process" prior to loading? We do a similar thing here: we get a feed from Hyperion Enterprise and run it through a home-grown conversion utility written in Windows Script before we load it into Essbase. It reads the file in line by line and then writes out a new, properly formatted file.
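    If it helps, here is a minimal sketch of that kind of field conversion in Java rather than Windows Script (assuming non-padded months/days and 24-hour times, per the format described above):

        import java.time.LocalDateTime;
        import java.time.format.DateTimeFormatter;

        public class DateFieldConverter {
            // accepts m(m)/d(d)/yyyy with a 24-hour time component
            private static final DateTimeFormatter IN = DateTimeFormatter.ofPattern("M/d/yyyy H:mm:ss");
            private static final DateTimeFormatter OUT = DateTimeFormatter.ofPattern("MM/yy");

            // rewrite one date field to the mm/yy style the load rule expects
            public static String toPeriod(String raw) {
                return LocalDateTime.parse(raw.trim(), IN).format(OUT);
            }

            public static void main(String[] args) {
                System.out.println(toPeriod("7/4/2005 13:05:00")); // prints 07/05
            }
        }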

  • How to catch failed rows from excel export data conversion

    I am pulling data from SQL Server and exporting to an Excel file, using SSIS 2008 and sending to Excel 2003. The process is working fine, and I want to capture any data conversion failures, specifically any data that fails or is about to be truncated.
    I added a flat file destination on the data conversion error path (the red line) and pointed it at a txt file. This caused an error saying some of the columns were the wrong data type to go into a text file. So I added a data conversion to the first data conversion's error path, but the data types won't change.
    The weird thing is that the error says the columns are DT_NTEXT and need to be DT_TEXT, but they aren't; they are DT_WSTR. Anyway, I tried to convert to DT_TEXT, and it caused the data conversions in my original conversion to change, which broke the whole package.
    My intention is to capture the erroring rows so they can be manually converted. So how do I do that without adding 100 more errors?

    Hi teahou,
    do you really use two accounts to post on the MSDN forums?
    I think the data types were not guessed correctly by the Flat File Destination component, so you need to adjust them using the advanced editor; the data conversion transformations then naturally become obsolete.
    Arthur

  • Non PeopleSoft data into EPM Warehouse

    I've probably placed this question in the wrong forum, and for that I apologize.
    I'm brand new to PeopleSoft in general and EPM in particular. We are in the process of installing the app to begin pulling data into EPM from our PeopleSoft module. But one of the first major pieces of data that we are going to need in our EPM warehouse comes from an in-house-written application. Can anyone point me to any applicable best practices or documentation for including this non-PeopleSoft data? I've searched and not had much luck, so I'm apparently looking in the wrong place.
    Thanks,
    Alan Junell

    It may be worth having a look at the supported systems for ERPi - http://docs.oracle.com/cd/E17236_01/epm.1112/erpi_admin_11121501/ch01s01.html
    Though your question sounds like what you are trying to achieve initially is to load non-PeopleSoft data, it depends on what product you are trying to load into and what format the data is in before options can really be given.
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • BPEL Component not spawning SOA processes

    I am trying to get a workflow running that passes a payload consisting of several UCM document metadata elements, a few custom elements, and the DocURL and idcReference to a SOA composite with a BPEL process on it.
    I have created the composite in JDeveloper with the SOA plugin, and I have added <binding.adf registryName="" serviceName="soa"> to the service entry in the composite.xml, just as the BPEL Component guide tells you to do.
    The BPEL process wizard creates by default a <process_name>.xsd which is used as the payload for the process. It has one input field of type string. I replaced that field with dID, dDocName, dDocType, xFormType, xNotificationType, xPackageId, xSubmissionId, xBatchId, xBatchDate, docURL, and idcRef, all of type string.
    I have deployed it to soa-infra and created a Connection Configuration (tested, works) and a Process Configuration with a Configuration ID of "soa" and an action pointing to my BPEL composite and process. I have created mappings to the BPEL payload, tying all of the pertinent UCM fields to their appropriate BPEL payload fields.
    I have created a criteria workflow with a step named SendToSOA which has scripts in it exactly like the example given in the BPEL Component guide, except with "soa" where the Configuration ID should go instead of "process_3". I added another step afterward with review for the "weblogic" user so that I can look at the workflow before it finishes.
    I check in content of the appropriate type, and the workflow is engaged and says that it passes the SendToSOA step, but when I go to soa-infra in EM and look to see whether any new processes have spawned, I see no change. No errors or log messages are produced in any of the logs on the UCM or SOA managed servers.
    I have rebuilt this process from the ground up about a hundred times. Once in a while, a deployment will start kicking off processes, with no consistent rhyme or reason. Sometimes (maybe 1 time in 10), when I have built the BPEL component without changing the payload (meaning I can only pass one piece of data to my BPEL process...), the workflow will start spawning processes.
    Once, it started working randomly with a full payload, and it worked fine as a vehicle for 3 separate workflows for an entire day, spawning processes on SOA like a champ. But then, because of a design change, I had to change the payload and add 2 new fields. I deployed the changed BPEL process to SOA with the new payload, and when I went to update the mappings on the Process Configuration, the Process Configuration page made no effort to notice the new payload settings, and the component provides no way to tell the Process Configuration to update itself. At first I tried to update the payload in the <processconfiguration>.hda file under <domain>/ucm/cs/data/orabpel/process/, but when I did, the workflow stopped creating BPEL processes on the SOA server. Then I tried deleting the configuration and rebuilding it, which did allow me to add the new payload fields, but still the workflow would never spawn SOA processes again.
    I have tried rebooting the UCM and/or SOA managed servers; I have tried rebooting them both, even between each configuration step. I have rebuilt everything from the ground up literally over a hundred times. I have tried changing the names of each of the components, from the composite to the BPEL process to the ADF binding service name to the workflow. I have tried disabling and enabling the workflows. I have tried making the ADF binding service name entirely unique, or matching the BPEL WSDL client name, or matching the BPEL process name, or matching the UCM process configuration ID. I have tried many different UCM process configuration IDs. I have tried simplifying the workflow script down to just <$obInvokeProcess("soa")$> or <$obConfigID="soa"$><$obInvokeProcess(obConfigID)$>. I have tried this whole process both on my development system and on a VM running UCM and SOA, in both Linux and Windows installations; the behavior is the same on both systems. I have tried it with asynchronous, synchronous, and one-way BPEL processes.
    So far, there have been 5 instances of the workflow suddenly working without any explanation: twice with just the default single-input payload created by the BPEL wizard in JDeveloper; once when I had added one extra input field named input2 (string), but that was on the VM on my laptop, so not useful to my development effort (I tried re-creating the exact same deployment on dev with the EXACT SAME steps, but on dev the workflow would not spawn SOA processes); once when the process started working with only the script <$obConfigID="soa"$><$obInvokeProcess(obConfigID)$> in the Entry Event of the workflow; and once with just <$obInvokeProcess("soa")$>. But trying to re-create these with full payloads failed.
    I am at wits' end. There is little information on the internet, on OTN, on Oracle Support, or in the intradoc_users Yahoo group about the BPEL Component.
    Please help?

    Sorry for the confusion, Vlad.
    By "And instance state is being shown as Running" I meant the instance state that we see in EM whenever we open our composite, along with all the other details like InstanceID, Name, ConversationID, etc.
    By "BPEL component instance is not generated" I meant that whenever we click on a particular instance ID, a flow trace window pops up where we see the complete trace. There I am not able to see the BPEL process; it did not get invoked at all.
    The flow of the BPEL, as mentioned in my first post:
    1. The DB is polled using a DB adapter (logical delete).
    2. The response from the DB is passed to the BPEL process for further processing.
    3. After some transformation, BPEL publishes the message to a JMS queue using a JMS adapter.
    Hope it helps.
    Regards,
    Karan
