BW Accelerator Tracing Best Practices
Hi Everyone,
Can you please share any documentation links for best practices in BW Accelerator tracing? Any document will be helpful. Thanks in advance.
Edited by: priya_lo on Nov 15, 2011 6:53 PM
See if this article helps:
http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/e05a68a8-83d6-2c10-0b8e-85823f352eea
Regards,
Gaurav
Similar Messages
-
Using WebI with SAP BW Data - Best practice for introducing BW Accelerator
Hi,
We have a significant investment in using BOE XI 3.1 SP2 integrated to SAP BW 7.0 EHP 1 SPS05.
Now we intend to introduce BW Accelerator to improve data fetch performance for ad hoc (WebI) analysis and formatted reports built using WebI (InfoView).
Data handling in question is approx. 2 Million+ records for each WebI report / adhoc analysis (20 to 30 columns).
The solution could be BW Cubes --> BW Accelerator --> BW Queries --> BO Universe --> WebI using Infoview
Does introducing BW Accelerator really help in a case like the one described above?
We understand that BW Accelerator can improve performance on the underlying data, so the BW queries run faster; but does it really deliver a 9x to 10x improvement for the MDX queries generated by the BO Universe (BOE XI 3.1 SP2) and WebI?
What is the roadmap for the future wrt BW Accelerator and SAP BI BO Integration; if we intend to use WebI ?
Or should we migrate to BO Explorer as the front end for ad hoc analysis?
Is BO Explorer able to present 1 Million + records with 20-30 columns ?
Which is the best practice / better performing, as an integrated product/solution?
1) BW Cubes --> BW Accelerator --> BW Queries --> SAP Integ Kit --> BO Universe --> WebI
2) BW Cubes --> BW Accelerator --> ??? --> BO Explorer --> ??? --> WebI ???
3) BW Cubes --> BW Accelerator --> ??? --> BO Pioneer --> ??? --> WebI ???
4) BW Cubes --> BW Accelerator --> ??? --> BO Explorer
5) BW Cubes --> BW Accelerator --> ??? --> BO Pioneer
6) BW Cubes --> BW Accelerator --> BW Queries --> SAP Integ Kit --> Crystal Reports (to handle above data volume)
7) BW Multiproviders --> BW Accelerator --> BW Queries --> SAP Web Analyzer (to handle above data volume)
regards,
Rajesh K Sarin
Edited by: Rajesh Sarin on Jan 25, 2010 4:05 PM
Hi,
We have a mix of ad hoc analysis (60%) and formatted reports (40%). We selected WebI as the tool for the purpose and used it for requirements that process approx. 2M records. We hit performance bottlenecks (we are on BO XI 3.1 SP2 & SAP BW 7.0 EHP1, SP05).
We are further analyzing the possibility of introducing BWA, to see if it can handle similar record processing and still preserve our investment in OLAP Universes, WebI, the SAP Integration Kit and training users on the WebI front end.
I see a lot of documentation suggesting "BO Explorer and BWA" - I understand that BWA would improve the DB time and BO Explorer would help on the front-end / OLAP time.
I would appreciate your guidance on the roadmap and on continuing our investment in BWA + WebI.
regards,
Rajesh K Sarin -
ASA 5505 Best Practice Guidance Requested
I am hoping to tap into the vast wealth of knowledge on this board in order to gain some "best practice" guidance to assist me with the overall setup using the ASA 5505 for a small business client. I'm fairly new to the ASA 5505 so any help would be most appreciated!
My current client configuration is as follows:
a) business internet service (cable) with a fixed IP address
b) a Netgear N600 Wireless Dual Band router (currently setup as gateway and used for internet/WiFi access)
c) a Cisco SG-500-28 switch
d) one server running Windows Small Business Server 2011 Standard (primary Domain Controller)
(This server is currently the DNS and DHCP server)
e) one server running Windows Server 2008 R2 (secondary Domain Controller)
f) approximately eight Windows 7 clients (connected via SG-500-28 switch)
g) approximately six printers connected via internal network (connected via SG-500-28 switch)
All the servers, clients, and printers are connected to the SG-500-28 switch.
The ISP provides the cable modem for the internet service.
The physical cable for internet is connected to the cable modem.
From the cable modem, a CAT 6 ethernet cable is connected to the internet (WAN) port of the Netgear N600 router.
A Cat 6 ethernet cable is connected from Port 1 of the local ethernet (LAN) port on the N600 router to the SG-500-28 switch.
cable modem -> WAN router port
LAN router port -> SG-500-28
The ASA 5505 will be set up with a "LAN" (inside) interface and a "WAN" (outside) interface. Port e0/0 on the ASA 5505 will be used for the outside interface and the remaining ports will be used for the inside interface.
So my basic question is, given the information above of our setup, where should the ASA 5505 be "inserted" to maximize its performance? Also, based on the answer to the previous question, can you provide some insight as to how the ethernet cables should be connected to achieve this?
Another concern I have is what device will be used as the default gateway. Currently, the Netgear N600 is set as the default gateway on both Windows servers. In your recommended best practice solution, does the ASA 5505 become the default gateway or does the router remain the default gateway?
And my final area of concern is with DHCP. As I stated earlier, I am running DHCP on Windows Small Business Server 2011 Standard. Most of the examples I have studied for the ASA 5505 utilize its DHCP functionality. I also have done some research on the "dhcprelay server" command. So I'm not quite sure which is the best way to go. First off, does the "dhcprelay server" even work with SBS 2011? And secondly, if it does work, is the best practice to use the "dhcprelay" command or to let the ASA 5505 perform the DHCP server role?
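For concreteness, a minimal ASA 5505 sketch of the two DHCP approaches being weighed might look like the following; the interface names and addresses (e.g. the SBS 2011 server at 192.168.1.10) are assumptions for illustration, not the poster's actual configuration:

```
! Option A (sketch): let the ASA act as the DHCP server on the inside VLAN
dhcpd address 192.168.1.100-192.168.1.199 inside
dhcpd dns 192.168.1.10
dhcpd enable inside
!
! Option B (sketch): keep SBS 2011 as the DHCP server. Inside clients on
! the same segment as the server need no relay at all; dhcprelay is only
! needed for clients on another interface (e.g. a guest segment) whose
! requests should be forwarded through the ASA to the server.
dhcprelay server 192.168.1.10 inside
dhcprelay enable guest
dhcprelay timeout 90
```

Note that on the ASA the DHCP server and DHCP relay features are mutually exclusive, so the two sketches above are alternatives rather than a combination.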
All input/guidance/suggestions with these issues would be greatly appreciated! I want to implement the ASA 5505 firewall solution following "best practices" recommendations in order to maximize its functionality and minimize the time to implement.
FYI, the information (from the "show version" command) for the ASA 5505 is shown below:
Cisco Adaptive Security Appliance Software Version 8.4(7)
Device Manager Version 7.1(5)100
Compiled on Fri 30-Aug-13 19:48 by builders
System image file is "disk0:/asa847-k8.bin"
Config file at boot was "startup-config"
ciscoasa up 2 days 9 hours
Hardware: ASA5505, 512 MB RAM, CPU Geode 500 MHz
Internal ATA Compact Flash, 128MB
BIOS Flash M50FW016 @ 0xfff00000, 2048KB
Encryption hardware device : Cisco ASA-5505 on-board accelerator (revision 0x0)
Boot microcode : CN1000-MC-BOOT-2.00
SSL/IKE microcode: CNLite-MC-SSLm-PLUS-2.03
IPSec microcode : CNlite-MC-IPSECm-MAIN-2.06
Number of accelerators: 1
0: Int: Internal-Data0/0 : address is a493.4c99.8c0b, irq 11
1: Ext: Ethernet0/0 : address is a493.4c99.8c03, irq 255
2: Ext: Ethernet0/1 : address is a493.4c99.8c04, irq 255
3: Ext: Ethernet0/2 : address is a493.4c99.8c05, irq 255
4: Ext: Ethernet0/3 : address is a493.4c99.8c06, irq 255
5: Ext: Ethernet0/4 : address is a493.4c99.8c07, irq 255
6: Ext: Ethernet0/5 : address is a493.4c99.8c08, irq 255
7: Ext: Ethernet0/6 : address is a493.4c99.8c09, irq 255
8: Ext: Ethernet0/7 : address is a493.4c99.8c0a, irq 255
9: Int: Internal-Data0/1 : address is 0000.0003.0002, irq 255
10: Int: Not used : irq 255
11: Int: Not used : irq 255
Licensed features for this platform:
Maximum Physical Interfaces : 8 perpetual
VLANs : 3 DMZ Restricted
Dual ISPs : Disabled perpetual
VLAN Trunk Ports : 0 perpetual
Inside Hosts : 10 perpetual
Failover : Disabled perpetual
VPN-DES : Enabled perpetual
VPN-3DES-AES : Enabled perpetual
AnyConnect Premium Peers : 2 perpetual
AnyConnect Essentials : Disabled perpetual
Other VPN Peers : 10 perpetual
Total VPN Peers : 12 perpetual
Shared License : Disabled perpetual
AnyConnect for Mobile : Disabled perpetual
AnyConnect for Cisco VPN Phone : Disabled perpetual
Advanced Endpoint Assessment : Disabled perpetual
UC Phone Proxy Sessions : 2 perpetual
Total UC Proxy Sessions : 2 perpetual
Botnet Traffic Filter : Disabled perpetual
Intercompany Media Engine : Disabled perpetual
This platform has a Base license.
Hey Jon,
Again, many thanks for the info!
I guess I left that minor detail out concerning the Guest network. I have a second Netgear router that I am using for Guest network access. It is plugged into one of the LAN ports on the first Netgear router.
The second Netgear (Guest) router is setup on a different subnet and I am letting the router hand out IP addresses using DHCP.
Basic setup is the 192.168.1.x is the internal network and 192.168.11.x is the Guest network. As far as the SBS 2011 server, it knows nothing about the Guest network in terms of the DHCP addresses it hands out.
Your assumption about the Guest network is correct, I only want to allow guest access to the internet and no access to anything internal. I like your idea of using the restricted DMZ feature of the ASA for the Guest network. (I don't know how to do it, but I like it!) Perhaps you could share more of your knowledge on this?
One final thing, the (internal) Netgear router setup does provide the option for a separate Guest network, however it all hinges on the router being the DHCP server. This is what led me to the second (Guest) Netgear router because I wanted the (internal) Netgear router NOT to use DHCP. Instead I wanted SBS 2011 to be the DHCP server. That's what led to the idea of a second (Guest) router with DHCP enabled.
The other factor in all this is SBS 2011. Not sure what experience you've had with the Small Business Server OSs, but they tend to get a little wonky if some of the server roles are disabled. For instance, this is a small business with a total of about 20 devices including servers, workstations and printers. Early on I thought, "nah, I don't need this IPv6 stuff," so I found an article on how to disable it and did so. The server performance almost immediately took a nose dive. Rebooting the server went from a 5 minute process to a 20 minute process. And this was after I followed the steps of an MSDN article on disabling IPv6 on SBS 2011! Well, long story short, I enabled IPv6 again and the two preceding issues cleared right up. So, since SBS 2011 by "default" wants DHCP set up, I want to try my best to accommodate it. So, again, your opinion/experience related to this is a tremendous help!
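On the restricted-DMZ idea for the guest network: with the Base license shown above ("VLANs: 3, DMZ Restricted"), a third VLAN can be created but must be blocked from initiating traffic to one of the other two. A rough sketch, with hypothetical VLAN numbers, port, and addresses:

```
! Hedged sketch of a restricted guest VLAN on a Base-licensed ASA 5505
interface Vlan3
 no forward interface Vlan1          ! guest may not initiate traffic to inside
 nameif guest
 security-level 50
 ip address 192.168.11.1 255.255.255.0
!
interface Ethernet0/5
 switchport access vlan 3            ! dedicate a switch port to the guest segment
!
! Let the ASA hand out guest addresses so SBS 2011 never sees this segment
dhcpd address 192.168.11.100-192.168.11.150 guest
dhcpd enable guest
```

The `no forward interface` command must be entered before `nameif`; it is what satisfies the Base license's third-VLAN restriction while still allowing guest traffic out to the internet.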
Thanks! -
Best Practice for Distributed TREX NFS vs cluster file systems
Hi,
We are planning to implement distributed TREX, using RedHat on x64, but we are wondering what the best practice or approach is for configuring the "file server" used in the TREX distributed environment. The guides mention a file server, which seems to be another server connected to a SAN, exporting or sharing the file systems required to be mounted on all the TREX systems (master, backup and slaves); but we know that the BI Accelerator uses OCFS2 (a cluster file system) to access the storage, and in the case of RedHat we have GFS or even OCFS.
Basically we would like to know which is the best practice and how other companies are doing it, for a TREX distributed environment using either network file systems or cluster file systems.
Thanks in advance,
Zareh
I would like to add one more thing: in my previous comment I assumed that it is possible to use a cluster file system on TREX because the BI Accelerator does, but maybe that is not supported; it does not seem to be clear in the TREX guides.
That should be the initial question:
Are cluster file system solutions supported on a plain TREX implementation?
Thanks again,
Zareh -
SAP PI conceptual best practice for synchronous scenarios
Hi,
Apologies for the length of this post, but I'm sure this is an area most of you have thought about in your journey with SAP PI.
We have recently upgraded our SAP PI system from 7.0 to 7.1 and I'd like to document best-practice guidelines for our internal development team to follow.
I'd be grateful for any feedback related to my thoughts below which may help to consolidate my knowledge to date.
Prior to the upgrade we implemented a number of synchronous and asynchronous scenarios using SAP PI as the hub at runtime, configured via the Integration Directory.
No interfaces to date are exposed directly from our backend systems using transaction SOAMANAGER.
Our asynchronous scenarios operate through the SAP PI hub at runtime, which builds in resilience and harnesses the benefits of the queue-based approach.
My queries relate to the implementation of synchronous scenarios where there is no mapping or routing requirement. Perhaps it's best that I outline my experience/thoughts on the 3 options and summarise my queries/concerns that people may be able to advise upon afterwards.
1) Use the SAP PI Integration Directory. I appreciate that going through SAP PI at runtime is not necessary and adds latency to the process, but the monitoring capability in transaction SXMB_MONI provides full access for audit purposes, and we have alerting running hourly so all process errors are raised and handled accordingly. In our SAP PI Production system we have a full record of sync messages, while these don't show in the backend system as we don't have propagation turned on. When we first looked at this, the reduction in speed seemed to be outweighed by the quality of the monitoring/alerting, given that none of the processes is particularly intensive or requires instant responses. We have some inbound interfaces called by two sender systems, so we have the overhead of maintaining the Integration Repository/Directory design/configuration twice for these systems, but the nice thing is SXMB_MONI shows which system sent the message. Extra work, but seemingly for improved visibility of the process. I'm not suggesting this is the correct long-term approach; it just states where we are currently.
2) Use the Advanced Adapter Engine. I've heard mixed reviews about this functionality; there are obvious improvements in speed by avoiding the ABAP stack on the SAP PI server at runtime, but some people have complained about the lack of SXMB_MONI support. I don't know if this is still the case as we're at SAP PI 7.1 EHP1, but I plan to test and evaluate once Basis have set up the pre-requisite RFC etc.
3) Use the backend system's SOAP runtime and SOAMANAGER. Using this option I can still model inbound interfaces in SAP PI but expose them using transaction SOAMANAGER in the backend ABAP system. [I would have tested the direct P2P connection option, but our backend systems are still at NetWeaver 7.0 and this option is not supported until 7.1, so that's out for now.] The clear benefit of exposing the service directly from the backend system is obviously performance, which in some of our planned processes would be desirable. My understanding is that the logging/tracing options in SOAMANAGER have to be switched on while you investigate, so there is no automatic recording of interface detail for retrospective review.
Queries:
I have the feeling that there is no clear-cut answer as to which of the options above you should select; the decision should be based upon the requirements.
I'm curious to understand SAP's intention with these options -
- For synchronous scenarios, is it assumed that the client should always handle errors, therefore the lack of monitoring should be less of a concern and option 3 desirable when no mapping/routing is required?
- Not only does option 3 offer the best performance, but the generated WSDL is ready, once built, for any further system to implement, thereby offering the maximum benefit of SOA. Should we therefore always use option 3 whenever possible?
- Is it intended that the AAE runtime should be used when available, but only for asynchronous scenarios or those requiring SAP PI functionality like mapping/routing, and otherwise customers should use option 3? I accept there are some areas of functionality not yet supported with the AAE, so that would be another factor.
Thanks for any advice, it is much appreciated.
Alan
Edited by: Alan Cecchini on Aug 19, 2010 11:48 AM
Edited by: Alan Cecchini on Aug 19, 2010 11:50 AM
Edited by: Alan Cecchini on Aug 20, 2010 12:11 PM
Hi Aaron,
I was hoping for a better more concrete answer to my questions.
I've had discussion with a number of experienced SAP developers and read many articles.
There is no definitive paper that sets out the best approach here but I have gleaned the following key points:
- Make interfaces asynchronous whenever possible to reduce system dependencies and improve the user experience (e.g. by eliminating wait times when they are not essential, such as by sending them an email with confirmation details rather than waiting for the server to respond)
- It is the client's responsibility to handle errors in synchronous scenarios, hence the monitoring lost with point-to-point services, compared to the detailed information in transaction SXMB_MONI for PI services, is not such a big issue. You can always turn on monitoring in SOAMANAGER to trace errors if need be.
- Choice of integration technique varies considerably by release level (for PI and NetWeaver), so the system landscape will be a significant factor. For example, we have some systems on NetWeaver 7.0 and others on 7.1. As you need 7.1 for direct-connection PI services, we'd rather wait until all systems are at the higher level than have mixed usage in our landscape - it is already complex enough.
- We've not tried the AAE option in a Production scenario yet, but this is only really important for high-volume interfaces, which is not a concern at the moment. Obviously cumulative performance may become an issue in time, so we plan to start looking at AAE soon.
Hope these comments may be useful.
Alan -
Dear All,
I am in need of some "best practices" in asset management in the telecommunications industry.
One of my LE clients would like to implement asset management. The concentration will be PM - equipment tracing & tracking in the system. A "best-practice" asset management system in their SAP ECC is the idea.
An insight into best practices of Plant Maintenance and equipment trace & tracking in the telecommunications industry is needed:
• Asset coding/naming design in the telecommunications industry (especially for "network assets" – whether there is some kind of best practice for naming the assets (hierarchies, naming conventions, etc.))
• Insight into plant maintenance's core functionality in telco
• Any tracing & tracking system proposal – barcode and/or RFID technologies for telecommunications asset management – any insight into partners working in this area.
They are specifically interested in Deutsche Telekom
Best regards
Yavuz Durgut - SAP Turkey
Hi,
You have a good start. What you need to do is:
1.) Find out what the requirements are -- what does your user want. If this is a fact-finding mission (e.g., they want to see what's in the system), then your requirement becomes "load the data into R/3", so figure out what they configured in PM and use those definitions as your requirements.
2.) Use those requirements to find data in the fields listed in the MultiProvider, InfoCubes, or DataStore Objects sections in your first link. In other words, now that you know what data to look for, look for it in the data targets (MPs, Cubes and DSOs). If you find some of the data you want, then trace back the InfoSources and determine which DataSources in R/3 load the data you are looking for. After all that, check those DataSources for any additional fields you may need and add them in.
So, if your company doesn't maintain equipment costs or maintenance costs for equipment, then you don't have to worry about 0PM_MP04. Use this type of logic to whittle down to what you really need and want, then activate only those objects.
Good Luck,
Brian -
Best Practice for sugar refinery process
hello, my company need to deploy a new business concerning raw sugar refinery.
so we need to analyze the business requirements and propose a process for refinery management .
step 1: arrival of goods in docks
step 2: raw sugar need to be charged in our stock ( quantity and value ) but is not our property
step 3: goods need to be delivered to our plant ( we pay the transport as service for our business partner )
step 4: goods need to be verified ( for quality and quantity ) and accepted by operators
step 5: goods are processed in a refinery plant, we need to verify timing, costs, quantity and human resources employed ( for costs remittance and transfer )
step 6: sugar is delivered to other industrial plants, warehouse and finally sold ( but is not our property ), for us it's like a refinery service.
step 7: we need to trace production lot from raw sugar arrival to the docks up to step 6 .
step 8: inventory and maintenance costs need to be traced because our profit is a part of this refinery service reduced by costs incurred
Any suggestions to find the right best practice?
I'm not a skilled BPS; I was looking at oil refinery but it is not the same process, so what can I look for?
Thanks
Hi Kumar,
In order to have consignment in SAP you need to have master data such as a material master, vendor master and a purchase info record of consignment type. You have to enter item category K when you enter the PO. The goods receipt posted to vendor consignment stock will be non-valuated.
1. The initial step is raising a purchase order for the consignment item.
2. The vendor receives the purchase order.
3. GR happens for the consignment material.
4. Stocks are received and placed under consignment stock.
5. Whenever we issue to production, or if we transfer post (using movement type 411) from consignment to own stock, a liability occurs.
6. Finally comes the settlement using transaction MRKO. You settle the amount for the goods consumed during a specific period.
regards
Anand.C -
Upcoming SAP Best Practices Data Migration Training - Chicago
YOU ARE INVITED TO ATTEND HANDS-ON TRAINING
SAP America, Downers Grove in Chicago, IL:
November 3 – 5, 2010
Installation and Deployment of SAP Best Practices for Data Migration & SAP BusinessObjects Data Services
Install and learn how to use the latest SAP Best Practices for Data Migration package. This new package combines the familiar IDoc technology together with SAP BusinessObjects (SBOP) Data Services to load your customer's legacy data to SAP ERP and SAP CRM (New!).
Agenda
At the end of this unique hands-on session, participants will depart with the SBOP Data Services and SAP Best Practices for Data Migration installed on their own laptops. The three-day training course will cover all aspects of the data migration package including:
1. Offering Overview – Introduction to the new SAP Best Practices for Data Migration package and data migration content designed for SAP BAiO / SAP ERP and SAP CRM
2. Data Services fundamentals – Architecture, source and target metadata definition. Process of creating batch jobs, validating, tracing, debugging, and data assessment.
3. Installation and configuration of the SBOP Data Services – Installation and deployment of the Data Services and content from SAP Best Practices. Configuration of your target SAP environment and deploying the Migration Services application.
4. Customer Master example – Demonstrations and hands-on exercises on migrating an object from a legacy source application through to the target SAP application.
5. Overview of Data Quality within the Data Migration process – A demonstration of the Data Quality functionality available to partners using the full Data Services toolset as an extension to the Data Services license.
Logistics & How to Register
Nov. 3 – 5: SAP America, Downers Grove, IL
Wednesday 10 AM – 5 PM
Thursday 9 AM – 5 PM
Friday 8 AM – 3 PM
Address:
SAP America – Buckingham Room
3010 Highland Parkway
Downers Grove, IL USA 60515
Partner Requirements: All participants must bring their own laptop to install SAP Business Objects Data Services on it. Please see attached laptop specifications and ensure your laptop meets these requirements.
Cost: Partner registration is free of charge
Who should attend: Partner team members responsible for customer data migration activities, or for delivery of implementation tools for SAP Business All-in-One solutions. Ideal candidates are:
• Data migration consultants and IDoc experts involved in data migration and integration projects
• Functional experts who perform mapping activities for data migration
• ABAP developers who write load programs for data migration
Trainers
Oren Shatil – SAP Business All-in-One Development
Frank Densborn – SAP Business All-in-One Development
To register please use the hyperlink below.
http://service.sap.com/~sapidb/011000358700000917382010E
Hello,
The link does not work. Is this training still available?
Regards,
Romuald -
Upcoming SAP Best Practices Data Migration Training - Berlin
YOU ARE INVITED TO ATTEND HANDS-ON TRAINING
Berlin, Germany: October 06 – 08, 2010
Installation and Deployment of SAP Best Practices for Data Migration & SAP BusinessObjects Data Integrator
Install and learn how to use the latest SAP Best Practices for Data Migration package. This new package combines the familiar IDoc technology together with SAP BusinessObjects (SBOP) Data Integrator to load your customer's legacy data to SAP ERP and SAP CRM (New!).
Agenda
At the end of this unique hands-on session, participants will depart with the SBOP Data Integrator and SAP Best Practices for Data Migration installed on their own laptops. The three-day training course will cover all aspects of the data migration package including:
1. Offering Overview – Introduction to the new SAP Best Practices for Data Migration package and data migration content designed for SAP BAiO / SAP ERP and SAP CRM
2. Data Integrator fundamentals – Architecture, source and target metadata definition. Process of creating batch jobs, validating, tracing, debugging, and data assessment.
3. Installation and configuration of the SBOP Data Integrator – Installation and deployment of the Data Integrator and content from SAP Best Practices. Configuration of your target SAP environment and deploying the Migration Services application.
4. Customer Master example – Demonstrations and hands-on exercises on migrating an object from a legacy source application through to the target SAP application.
Logistics & How to Register
October 06 – 08: Berlin, Germany
Wednesday 10 AM – 5 PM
Thursday 9 AM – 5 PM
Friday 9 AM – 4 PM
SAP Deutschland AG & Co. KG
Rosenthaler Strasse 30
D-10178 Berlin, Germany
Training room S5 (1st floor)
Partner Requirements: All participants must bring their own laptop to install SAP Business Objects Data Integrator on it. Please see attached laptop specifications and ensure your laptop meets these requirements.
Cost: Partner registration is free of charge
Who should attend: Partner team members responsible for customer data migration activities, or for delivery of implementation tools for SAP Business All-in-One solutions. Ideal candidates are:
• Data migration consultants and IDoc experts involved in data migration and integration projects
• Functional experts who perform mapping activities for data migration
• ABAP developers who write load programs for data migration
Trainers
Oren Shatil – SAP Business All-in-One Development
Frank Densborn – SAP Business All-in-One Development
To register please follow the hyperlink below
http://intranet.sap.com/~sapidb/011000358700000940832010E
Hello,
The link does not work. Is this training still available?
Regards,
Romuald -
Must use Captivate v4.0 to capture playing video - Best practices & PC requirements?
Hi all,
I like Captivate, but have had a lot of trouble capturing playing video (a big part of my client's product) in the past. Despite encouraging client to consider other tools, they have decided they want to continue using Captivate and output to SWF. Some of us on the team for a previous project using v3.0 were able to successfully capture moving video, others weren't. (Vaguely recall something to do with a Hardware Acceleration setting, but it didn't work for everyone.) We've all upgraded to v4.0 but haven't used it yet. My questions are:
1. Has this been improved at all in v4.0?
2. What is the "optimum" PC hardware for doing this? (I am purchasing a new desktop, and need to keep cost to a minimum, BUT that is secondary to being able to produce these demos with moving video captured well!)
3. Can you give me any best practice tips specific to capturing moving video? FYI, we will be adding audio narration (after capture), but will not be using written captions except for a few Notes/Tips.
Thanks in advance for your help!
Katie
Senior Technical Writer
Phoenix, AZ -
Authorization Scheme -- Best Practices?
Hi All --
We have a reporting application containing approximately 300 pages and 60 or so menu items, all using authorization schemes (the exists-SQL method) as a means to determine whether or not a user can see the menu items and/or access the pages. We've been seeing an issue where a user logging into the application experiences poor performance at login, and we have traced it to our access checks and the number of "exists" queries run when a user logs in, before our menu is displayed.
What would be considered best practice in a case such as this? Does anyone have any ideas on how to increase the performance on these authorizaton checks?
Thanks,
Leigh Johnson
Fastenal Company
Leigh - No, the asktom post Joel referred to is posted above: http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:62048567543425
We just want to know if this post is from you folks or not.
About the authorization schemes for each page, I would think that whatever scheme you code to authorize a link to a page, e.g., on a menu, would be the same scheme you'd want to attach to the page itself.
So the authorization has to take place first at the point you render (or suppress) a link to a page, and again at the point the page is requested (the latter being necessary because a user can bypass the menu links and try to access pages directly by entering the page ID in the URL).
So again, if you have X links on the menu page, each requiring a distinct query for authorization, you'll have to pay the price of doing all that authorization once per session because of the design of the menu page. More precisely, the authorization scheme code, e.g., the EXISTS queries, has to be executed once per session per resource access attempted. For performance purposes, the results of these checks are cached for the duration of the session (because you set them up to be evaluated once per session and not on every page view).
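To make the discussion concrete, an EXISTS-style authorization scheme query of the kind described typically looks something like this (the table, column, and role names here are hypothetical):

```sql
-- Returns a row when the logged-in user holds the required role;
-- attached to an authorization scheme evaluated once per session
SELECT 1
  FROM app_user_roles r
 WHERE r.username  = :APP_USER
   AND r.role_code = 'SALES_REPORTS'
```

Each distinct scheme like this runs once per session per resource, which is why a menu with many links, each guarded by a different scheme, pays the full cost the first time it is rendered.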
One thing that might help you is region caching (or page caching) for the menu. You'd use the Cache By User option, of course. Then if the same named user logged in and out numerous times during the "cache valid" period, which is adjustable, the user would see the cached menu "instantly". Authorization checks will not have been performed during these page requests however, so you'd want to be sure that it makes sense to present cached versions of these links. However, the corresponding authorization schemes that you'd attach to the pages themselves would be evaluated when the user clicked on a "cached" link, so you'll get the protection you need, ultimately.
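The once-per-session evaluation described above can be illustrated outside of APEX. The sketch below is a minimal, hypothetical Python model (the scheme name, the `is_authorized` helper, and the check function are invented for illustration, not APEX API): each scheme's check runs at most once per session, and subsequent lookups hit the cache.

```python
# Minimal model of per-session caching of authorization scheme results.
# The check function stands in for APEX's EXISTS query; all names are invented.

class Session:
    def __init__(self, user):
        self.user = user
        self._auth_cache = {}   # scheme name -> bool, evaluated at most once
        self.checks_run = 0     # how many real checks we actually executed

    def is_authorized(self, scheme, check_fn):
        """Return the cached result if present; otherwise run the check once."""
        if scheme not in self._auth_cache:
            self.checks_run += 1
            self._auth_cache[scheme] = check_fn(self.user)
        return self._auth_cache[scheme]


def can_view_reports(user):
    # Stand-in for an EXISTS query against an authorization table.
    return user in {"leigh", "scott"}


session = Session("leigh")
# The first call executes the check; the next ten are served from the cache.
results = [session.is_authorized("VIEW_REPORTS", can_view_reports) for _ in range(11)]
print(results[0], session.checks_run)  # True 1
```

Because the result is cached for the session, the cost of every expensive check is paid in one burst at login, which is exactly why a slow login with many schemes is the symptom described above.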
Scott -
Best Practice for new BI project
We are about to start a new BI project.
What is the best practice for starting a new BI project?
1. Start only with FI as a proof of concept.
2. Start with more than one module - it does not matter, provided the project is planned and scoped well.
Can you give me links to best practices for starting BI projects?
Regards
Hi,
Please go through following links.
Business Intelligence : Getting Started with Business Intelligence ,Reporting, Analysis, and Planning , Data Warehousing, BI Accelerator Embedded and Process-Centric BI
https://www.sdn.sap.com/irj/sdn/nw-bi
The Home of SAP Business Warehouse (BW)
http://www.erpgenie.com/sap/sapfunc/bw.htm
SAP Business Information Warehouse
http://help.sap.com/saphelp_nw04/helpdata/en/b2/e50138fede083de10000009b38f8cf/content.htm
Business Intelligence : Programming in BW
https://www.sdn.sap.com/irj/sdn/wiki?path=/display/bi/programminginBW&
BW and Portals 2005
https://www.sdn.sap.com/irj/sdn/bi-and-portals2005
SAP Business Warehouse (BW) Overview
http://gleez.com/sap/bw/overview
Business Intelligence : Steps to get started with SAP BW
https://www.sdn.sap.com/irj/sdn/wiki?path=/display/bi/stepstogetstartedwithSAPBW&
SAP Business Information Warehouse Scenarios
http://help.sap.com/bp_biv335/BI_EN/html/Bw.htm
SAP BW Learning Guide
http://searchsap.techtarget.com/general/0,295582,sid21_gci1077480,00.html
SAP BW Business Warehouse - Introduction
http://www.thespot4sap.com/Articles/SAP_BW_Introduction.asp
Business Content Frontend Design Guidelines (NW04)
https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/bda556e9-0c01-0010-83b0-d519d6deb9e9
How To Create and Maintain UI Patterns of BI Content
https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/ad214fe9-0c01-0010-4291-a629e5ba5f90
SAP NetWeaver BI Integrated Planning for Finance
http://www.sap-press.de/katalog/buecher/htmlleseproben/gp/htmlprobID-113
Info object,infocube,infosource,datasource,commn structure,extract structure ..etc..
http://www.erpgenie.com/sapgenie/docs/MySAP%20BW%20Cookbook%20Vol%201.pdf
https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/01b9395c-0e01-0010-6786-c4ee5e5d2154
BI Performance Tuning
FAQ - The Future of SAP NetWeaver Business Intelligence in the Light of the NetWeaver BI&Business Objects Roadmap
Business Intelligence Journal Improving Query Performance in Data Warehouses
http://www.tdwi.org/Publications/BIJournal/display.aspx?ID=7891
Achieving BI Query Performance Building Business Intelligence
http://www.dmreview.com/issues/20051001/1038109-1.html
SAP Business Intelligence Accelerator: A High-Performance Analytic Engine for SAP NetWeaver Business Intelligence
http://www.sap.com/platform/netweaver/pdf/BWP_AR_IDC_BI_Accelerator.pdf
BI Performance Audit
http://www.xtivia.com/downloads/Xtivia_BIT_Performance%20Audit.pdf
https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/10564d5c-cf00-2a10-7b87-c94e38267742
Enhancements in SAP BW
https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/59069d90-0201-0010-fd81-d5e11994d8b5
Customer Enhancements in SAP NetWeaver BI (Exits, BAdIs and ABAP in the SAP NetWeaver BI Back End)
http://www.sap.com/community/pub/showdetail.epx?itemID=5257
BW 3.x and 3.5 "How To" Guides List
https://www.sdn.sap.com/irj/sdn/docs?rid=/webcontent/uuid/ee14e25d-0501-0010-11ad-8eb2861a7ec0 [original link is broken]
Assign points if it helps
Thanks & Regards
santo -
Hi, all.
Over the weekend, we applied SPS17 to the ECC 6.0 server running as a dual stack. We also updated HCM EHP3 to stack 6.
We have a lot of WD for ABAP and WD for JAVA applications running on the ECC dual stack server. The contents are federated to the consumer portal running on EP7.0 SPS21. Note the consumer was NOT patched during the weekend.
On Monday morning, we received many calls from users reporting that their HCM apps were not working on the consumer portal. The error can appear in many different ways. The fix so far is to clear the user's IE cache, after which everything works again. Note that the problem doesn't happen to everybody, less than 10% of the user population, but that 10% is enough to flood our helpdesk with calls.
I am not sure if any of you has run into this problem before. Is it best practice to clear the IE cache for all users after an SP upgrade? Any idea what caused the error?
Thanks,
Jonathan.
Hi Jonathan,
I have encountered a similar situation before but have unfortunately never got to the root cause of it. One thing I did notice was that browser versions tended to affect how the cache was handled for local users. We noticed that IE7 handled changes in the WDA apps much better than certain versions of IE6. Not sure if this is relevant in your scenario.
I assume also that you are not using ACCAD or other WAN acceleration devices (as these have their own cache that can break on upgrades) and that you've cleared out your portal caches for good measure. As far as I know in ITS, if you've stopped and started the WDA services during the upgrade then the caching shouldn't be a problem.
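One general mitigation for stale browser caches after an upgrade is cache busting: appending a version token to static resource URLs so browsers treat post-upgrade resources as new files instead of serving old cached copies. This is a generic web technique, not an SAP-specific mechanism, and the version string and URL below are purely illustrative:

```python
# Cache-busting sketch: append a version token to resource URLs so browsers
# fetch fresh copies after an upgrade instead of serving stale cached files.
from urllib.parse import urlencode, urlparse, parse_qsl, urlunparse

APP_VERSION = "SPS17"  # hypothetical token, bumped on every support-package upgrade

def versioned_url(url, version=APP_VERSION):
    """Add (or replace) a 'v' query parameter carrying the deployed version."""
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))
    query["v"] = version
    return urlunparse(parts._replace(query=urlencode(query)))

print(versioned_url("https://portal.example.com/wda/app.js"))
# https://portal.example.com/wda/app.js?v=SPS17
```

Since the URL changes whenever the version token changes, even aggressive browser caches (like the IE6 behavior mentioned above) are forced to re-fetch after an upgrade, with no manual cache clearing by users.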
Cheers,
E -
Best Practices for CS6 - Multi-instance (setup, deployment and LBQ)
Hi everyone,
We recently upgraded from CS5.5 to CS6 and migrated to a multi-instance server from a single-instance. Our current applications are .NET-based (C#, MVC) and are using SOAP to connect to the InDesign server. All in all it is working quite well.
Now that we have CS6 (multi-instance) we are looking at migrating our applications to use the LBQ features to help balance the workload on the INDS server(s). Where can I find some best practices for code deployment/configuration, etc for a .NET-based platform to talk to InDesign?
We will be using the LBQ to help with load management for sure.
Thanks for any thoughts and direction you can point me to.
~Allen
Please see if the below Metalink note guides you:
Symmetrical Network Acceleration with Oracle E-Business Suite Release 12 [ID 967992.1]
Thanks,
JD -
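For the multi-instance question above: the LBQ handles queuing on the server side, but the underlying idea of spreading jobs across InDesign Server instances can be sketched generically. The Python below is a conceptual illustration only - the instance ports and job names are hypothetical, and this is not the Adobe LBQ API:

```python
# Conceptual round-robin dispatch of render jobs across server instances.
# Ports and job payloads are hypothetical; a real setup would go through the LBQ.
from itertools import cycle

INSTANCE_PORTS = [18383, 18384, 18385]  # hypothetical multi-instance ports

def assign_jobs(jobs, ports=INSTANCE_PORTS):
    """Pair each job with the next instance port, cycling through the pool."""
    return list(zip(jobs, cycle(ports)))

jobs = [f"render-{n}" for n in range(5)]
for job, port in assign_jobs(jobs):
    print(f"{job} -> instance on port {port}")
```

The point of delegating this to the LBQ rather than hand-rolling it client-side is that the queue also accounts for instance health and current load, which a naive round robin like this does not.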
Best practices for logging results from Looped steps
Hi all
I would like to start a discussion to document best practices for logging results (to reports and databases) from looped steps.
As an application example - let's say you are developing a test for one of NI's analog input or output cards and need to measure a voltage across multiple inputs or outputs.
One way to do that would be to create a sequence that switches the appropriate signals and performs a "Voltage Measurement" test in a loop.
What are your techniques for keeping track of the individual measurements so that they can be traced to the individual signal paths that are being measured?
I have used a variety of techniques such as
i) Creating a custom step type that generates unique identifiers for each iteration of the loop. This required some customization of the results processing, and the sequence developer had to include code to ensure that a unique identifier was generated for each iteration.
ii) Adding an input parameter to the test function/VI, passing the loop iteration to it, and adding this to the step's Additional Results parameters to log.
I have attached a simple example (LV 2012 and TS 2012) that includes steps inside a loop structure as well as a looped test.
If you enable both database and report generation, you will see the following:
1) The numeric limit test in the for loop always generates the same name in the report and database, which makes it difficult to determine the result of a particular iteration.
2) The Max voltage test report includes the parameter as an additional result, but the database does not include any differentiating information.
3) The looped limit test generates both unique reports and database entries - you can easily see the result of each iteration.
As mentioned, I am seeking to start a discussion for how others handle results for steps inside loops. The only way I have been able to accomplish a result similar to that of the Looped step (unique results and database entry for each iteration of the loop) is to modify the process model results processing.
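The core of techniques i) and ii) above - deriving a unique, traceable identifier per loop iteration - can be sketched in plain Python. The step and channel names below are invented for illustration; in TestStand this string would feed the step's result name or an additional result:

```python
# Sketch: build a unique, traceable result name for each loop iteration so a
# "Voltage Measurement" repeated over channels can be told apart in the report.

def result_name(step_name, signal_path, iteration):
    """Combine the step name with the signal path and loop index."""
    return f"{step_name}[{signal_path}]#{iteration}"

measurements = {}
for i, channel in enumerate(["AI0", "AI1", "AI2"]):
    name = result_name("Voltage Measurement", channel, i)
    measurements[name] = 3.3  # stand-in for the measured voltage

print(sorted(measurements))
# ['Voltage Measurement[AI0]#0', 'Voltage Measurement[AI1]#1', 'Voltage Measurement[AI2]#2']
```

Embedding both the signal path and the iteration index in the name is what makes each database row and report entry traceable back to the physical channel that was measured.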
Attachments:
test.vi 27 KB
Sequence File 2.seq 9 KB