Move of Repository and Audit Databases - Déplacement des bases de Référentiel et d'Audit
Post Author: Karine
CA Forum: Administration
Hello,
Could you tell me the procedure to follow in the following case? The BO repository Oracle database and the Audit database are currently on server A. For various reasons, they have to be migrated to another server B, whose name and IP address are different. Could you give me the steps to carry out, and in what order? Thank you.
Have a nice day,
Karine.
Post Author: jsanzone
CA Forum: Administration
Karine,
You will have to bring down your BO services. Then, on the BO server, modify the ODBC connection so that it points from its current target to the new server (hopefully the account names and passwords have not changed; if they have, you will need to take care of that detail as well). Once the ODBC connection is corrected, you should be able to bring the BO services back up.
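Connections on the BO server typically resolve the Oracle database through Oracle Net, so alongside the ODBC change the repoint can be as small as editing the alias in tnsnames.ora. A hedged sketch, with placeholder alias, host and service names:

```
BO_REPO =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = serverB.example.com)(PORT = 1521))
    (CONNECT_DATA = (SERVICE_NAME = borepo))
  )
```

The same edit applies to the Audit database alias; the DSN itself keeps its name, so the Business Objects configuration does not change.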
Similar Messages
-
Informatica Repository and Target database
I have a scenario in which the repository database is DB1 and the target database is DB2. How is that possible? I mean, isn't the target database stored in the repository? Is it still alright?
Hi,
Informatica repository tables are generally referred to as OPB tables; they contain all the data related to your mappings, and Informatica uses these tables to resolve the mappings.
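As a hedged illustration of those OPB tables (table and column names vary by PowerCenter version and should be verified against your own repository; OPB_MAPPING is one commonly present example):

```sql
-- Hypothetical query against the Informatica repository schema.
-- OPB_MAPPING holds one row per mapping; verify the exact table and
-- column names in your PowerCenter version before relying on this.
SELECT mapping_name,
       last_saved
FROM   opb_mapping
WHERE  is_valid = 1;
```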
Thanks,
Navin Kumar Bolla -
OWB 10gR1 Repository and 10gR2 Database
Hi,
we have a problem in installing OWB Design Repository 10gR1 on 10gR2 Database. The Repository assistant hangs with message "INS0003 The User should have DBA privileges to complete the installation". I'm connected as SYS....
Regards,
Hakon.
Here is a workaround I got from Oracle; it may work. I have not been able to test it yet, so try it at your own risk.
Regards
Norbert
>>
Using the 050621 shiphome release of 10gR2 on Linux we have noticed the INTERACTIONEXECUTE stored procedure is defined with definer rights - it should be invoker rights.
Using the stored procedure to create an AW will create the AW in the SYS schema.
The only workaround is to redefine the stored procedure in the SYS schema with invoker rights.
CREATE OR REPLACE FUNCTION interactionexecute(input CLOB) RETURN VARCHAR2
  AUTHID CURRENT_USER
AS LANGUAGE JAVA
  NAME 'oracle.AWAction.Interaction.Execute(oracle.sql.CLOB) return java.lang.String';
<< -
Changing CMS and Audit Repository databases from Oracle to SQl server 2008
Hi guys,
We have a Business Objects Dev environment which was installed with Oracle 10g database for CMS and Audit Repository.
Our database team now decided to change the CMS and Audit databases of Dev BOE from Oracle to SQL server 2008.
What is the ideal way to achieve this? I'm concerned because the old DB is Oracle and the new one would be SQL server.
Earlier, I changed the CMS database from one to another by stopping the SIA, backing up the old DB into the new one, and changing it in the Update Database option. But in that case, both the old and new CMS databases were on SQL Server 2005.
Thanks,
Ganga
Denise,
Thanks for the solution.
We have done Windows AD and SAP integration on the Dev BOE. Will there be any issues with those after the DB change? I am guessing there won't be, but I just want to confirm. Please reply.
Also, we need to stop the old SIA and start using the new SIA after step two is done, right? -
Connection pooling and auditing on an oracle database
Integration of a WebLogic application with an Oracle backend: connection pooling and auditing, two conflicting requirements?
Problem statement:
We are in the process of maintaining a legacy client/server application where the client is written in PowerBuilder and the backend uses an Oracle database. Almost all business logic is implemented in stored procedures on the database. When working in client/server mode, one PowerBuilder user has a one-to-one relation with a connection (session) on the Oracle database.
It is a requirement that the database administrator must see the real user connected to the database and NOT some kind of superuser; therefore, in the PowerBuilder app each user connects to the database with his own username. (Each user is configured on the database via a separate PowerBuilder security app.) For the PowerBuilder app all is fine, and the app can maintain conversational state (setting and reading of global variables in Oracle packages).
Management is pushing for a web-based application where we will be using the BEA WebLogic application server (J2EE based). We have built a web-based business app that accesses the same Oracle backend app as the PowerBuilder app does. The first version of this web-based app uses a custom-built connector (based on the JCA standard and derived from a template provided by the WebLogic Integration installation). This custom-built connector is essentially a combination of a custom realm, in WebLogic terms, and a degraded connection pool, where each web session (browser) has a one-to-one relation with the backend database. The reason this custom connector combines the security functionality and the pooling functionality is that each user must be authenticated against the Oracle database (security requirement) and NOT against an LDAP server, and we are using a stateful backend (Oracle packages), which would make it difficult to reuse connections.
A problem that surfaced while doing heavy load testing with the custom connector is that sometimes connections are closed and new ones made in the midst of a transaction. If you imagine a scenario where a session bean creates a business entity, and the session bean calls one entity bean for the header and one entity bean for the detail, then the header and detail must be created in the same transaction AND with the same connection (there is a parent-child relationship between header and detail, enforced on the backend database via primary and foreign keys). We have not yet found why WebLogic is closing the connection!
A second problem that we are experiencing with the custom connector is the use of CMP (container-managed persistence) within entity beans. The J2EE developers state that the use of CMP decreases development time and thus also maintenance costs. We have not yet found a way to integrate a custom connector with the CMP persistence scheme!
In order to solve our load-testing and CMP persistence problems, I was asked to come up with a solution which should not use a custom connector, but use standard connection pools from WebLogic. To resolve the authentication problem on WebLogic, I could make a custom realm which connects to the backend database with the username and password; if the connection is OK, I could consider this user as authenticated in WebLogic. That still leaves me with the problem of auditing and pooling.
If I were to use a standard connection pool, then all transactions made in the Oracle database would be done by a pool user or superuser, a solution which will be rejected by our local security officer, because you cannot see which real user made a transaction in the database. I could still use the connection pool and, in the application, advise the application developers to set an Oracle package variable with the real user; then, on arrival of the request in the database, the logic could use this package variable to set the transaction user.
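A minimal sketch of that package-variable idea, using Oracle's built-in client identifier instead of a hand-rolled package (DBMS_SESSION and SYS_CONTEXT are standard from Oracle 9i onward; the app_audit table and trigger below are hypothetical):

```sql
-- Run by the application right after borrowing a pooled connection,
-- so the session is tagged with the real end user:
BEGIN
  DBMS_SESSION.SET_IDENTIFIER('real_user_name');
END;
/

-- Database-side logic can then read the tag, e.g. in an audit trigger
-- (table and trigger names are illustrative):
CREATE OR REPLACE TRIGGER trg_orders_audit
AFTER INSERT OR UPDATE ON orders
FOR EACH ROW
BEGIN
  INSERT INTO app_audit (changed_by, changed_at)
  VALUES (SYS_CONTEXT('USERENV', 'CLIENT_IDENTIFIER'), SYSDATE);
END;
/
```

The DBA still sees the pool user in the session list, but audit rows carry the real user, which may be enough to negotiate with the security officer.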
There are still problems with the package-variable approach:
- The administrator of the database still cannot see who is connected; he will only see the superuser connection.
- This scheme cannot be used when you want to use CMP persistence, since it is WebLogic that generates the code to access the database.
I thought I had a solution when Oracle provided us with a connection pool known as OracleOCIConnectionPool, where a connection is made by a superuser but sessions are multiplexed over this physical pipe with the real user. I cannot seem to properly integrate this OCI connection pool into WebLogic. When using this pool, and we come into a bean (session or entity bean), WebLogic wraps the pool with its own internal DataSource and gives me back a connection for the superuser, but not one for the real user, thus setting me with my back to the wall again.
I would appreciate it if anyone who has experienced the same problem could share a possible solution with us, in order to satisfy all requirements (security, auditing, CMP).
Many thanks,
Blyau Gino
[email protected]Hi Blyau,
As Joe has already provided some technical advice,
I'll try to say something at the engineering-process level.
While migrating an application from one technology to another, like client-server to n-tier in your case, customers and stakeholders want to push into the new system as many old requirements as possible. This approach is AKA "we must have ALL of the features of the old system". Mostly it happens because they don't know what they want. Add a little understanding of the abilities of the new technology, and you will get a requirement like the one you have in your hands.
I think "DBA must see real user" is one of those. For this type of requirement it can make sense to try to drop it, or to understand its nature and suggest alternatives. In this particular case it could be a system that logs user names and login and logout times.
Blind copying of old features into an incompatible new architecture may endanger the whole project and can result in its failure.
Hope this helps.
Regards,
Slava Imeshev
-
Can i use Oracle Database Audit Vault and Oracle Database Firewall on Solaris?
Can i use Oracle Database Audit Vault and Oracle Database Firewall on Solaris?
4195bee8-4db0-4799-a674-18f89aa500cb wrote:
I don't have access to My Oracle Support. Can you send the text or HTML of the document, please?
Moderator Action:
No they cannot send you a document that is available only to those with access to MOS.
That would violate the conditions of having such service contract credentials.
Asking someone to violate such privileges is a serious offense and could get that other person's organization banned from all support and all their support contracts cancelled.
Your post is locked.
Your duplicate post that you placed into the Audit Vault forum space has been removed (it had no responses).
This thread which you had placed in the Solaris 10 forum space is moved to the Audit Vault forum space.
That's the proper location for Audit Vault questions. -
Oracle Database Firewall and Audit Vault - alert category in HP ArcSight SIEM
HI,
in the new Oracle Database Firewall and Audit Vault 12.1.x, the "alert" category that can be sent to ArcSight SIEM is missing; it is only available for syslog.
Do you know why? In the old version (5.1) you could choose the alert category for both formats, syslog and ArcSight SIEM.
Thx
Matteo
Well,
In case of someone needs it.
I found something in Note: 105047
https://websmp230.sap-ag.de/sap(bD1wdCZjPTAwMQ==)/bc/bsp/sno/ui_entry/entry.htm?param=69765F6D6F64653D3030312669765F7361… -
DB2 UDB9.1 support as CE10 CMS and Audit database
Does anybody know if DB2 UDB 9.1 is supported as the CMS and Audit database for Crystal Enterprise (CE) version 10?
I know CE 10 is no longer supported by BO; we are in the process of migrating to BO XI, but meanwhile we need to upgrade DB2 UDB from 8.1 to 9.1.
Also, did anyone encounter any issues using CE 10 against DB2 UDB 9.1?
Please advise.
Thanks
Hello Romain,
Thank you very much for your clarifications.
I guess there was a second question hidden behind my first question.
Here is the whole story: we had a workshop yesterday with some consultants specialized in BI 4 technologies. We were discussing the architecture, and more particularly which would be the most suitable web application server:
* It seems that most SAP customers choose the bundled WAS: Tomcat 7
* But as SAP Basis administrators, we have had pretty much no previous exposure to Tomcat/Apache technologies
* The consultants told us that an increasing number of SAP customers are choosing SAP NW AS for deployment of BOE.war and other BI web apps
* So we are now considering installing dedicated NetWeaver AS instances on Windows that will act as BI 4.1 application servers. We do not plan to deploy BI 4.1 applications on any of our existing NW AS instances, as they were installed on HP-UX (not supported by BI 4)
The only problem is that you cannot install a NetWeaver AS without its corresponding database, so I was wondering if that new Oracle database, installed only when deploying the "NW AS/BI 4 web application tier", could be used as the CMS/Audit database.
Your latest answer seems to suggest that it is definitely not a good idea.
The thing is that we do not want to end up with two databases per landscape tier (DEV, QAS, PRD).
I hope it is a bit more clear.
Thank you -
"Compliance and Audit Database domain account is not valid" error
I am attempting to install MBAM 2.5, and on the Configure Reports page I am getting an error stating "Compliance and Audit Database domain account is not valid" when trying to specify the account. I am using the same account as the read-only access domain user specified on the Configure Databases page.
The account is not locked out and I reset the password, but no dice. Any idea what is going on here? This is my second attempt at an install (I removed all previous install pieces, including the databases).
I had the exact same problem. I had to add the account to the local Administrators group in order to proceed. The installation then informs you that the account is an administrator at the end.
I removed it from the Administrators group. It doesn't seem to have caused any problems. Everything is working OK. -
Backingup and Restoring CMS and Auditing Database
We have a scenario where our database server is completely down; we are unable to start services on SQL Server, and it is not possible to restore them. Our CMS and Auditing databases are on that server. We have access to the file system, from which we can take the MDF and LDF files and restore them to a new database. But the CMS database cannot be restored that way. How can I restore the CMS to its original state? We also have a backup for those databases, i.e., a .bak file. Please help
Anil,
You should be able to copy the *.MDF and *.LDF set from one SQL Server to another and mount them on another working SQL Server. Once you have validated that the SQL Server database is mounted properly, simply update the ODBC on the BOBJ box to point to the new server. Make sure the ODBC name stays the same.
Your problem is more related to backup and restore/recovery of SQL Server.
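The mount step described above can be sketched in T-SQL (the database name and file paths below are placeholders; the syntax is SQL Server's CREATE DATABASE ... FOR ATTACH):

```sql
-- Hypothetical attach of the copied CMS data and log files on the
-- new SQL Server instance; adjust the name and paths to your environment.
CREATE DATABASE BOE_CMS
ON (FILENAME = 'D:\Data\BOE_CMS.mdf'),
   (FILENAME = 'D:\Data\BOE_CMS_log.ldf')
FOR ATTACH;
```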
Regards,
Ajay -
BM 3.9 audit log on different servers (and different databases)
Hi
after migrating from BM 3.8 to BM 3.9 we installed Rule Hit Logging with naudit and a MySQL database.
We have two BM 3.9 servers connected with a WAN link, so we want to keep the logs on the servers and do not want to log over the WAN link.
We installed MySQL on each server and configured a separate Notification and Channel object for each server in iManager.
Still, one server now logs to both databases (on both sides of the WAN link).
Can anybody tell me what I have missed?
Thanks
Andrej
Perhaps you have an old entry in the proxy.cfg file?
Craig Johnson
Novell Support Connection SysOp
*** For a current patch list, tips, handy files and books on
BorderManager, go to http://www.craigjconsulting.com *** -
One schema for OWB Design repository, runtime repository and target schema
Currently, the contents of the OWB design schema, runtime schema and target schema are all combined into one schema of the same database, in OWB 9.0.2 as well as OWB 3i. We would like to move to OWB 10g in the very near future. Can we keep the same structure, for convenience of migration, in OWB 10g? Is it mandatory that the OWB design repository (and components) be separate from the OWB runtime repository (and components) and the target schema? In other words, is it possible and workable to use only one schema to contain the OWB design repository, OWB runtime repository and target schema in an OWB 10g environment, with the repositories situated on Oracle 9.2.0.1.0? Also, what special considerations should be taken to create the 9.2.0.1.0 database and install OWB 10g? What are the problems/side effects of having everything in one schema?
Also, please let me know how to install Oracle Workflow server to be used along with OWB. Will OWB 10g work with a repository on an Oracle 9.2.0.1.0 database?
Your prompt advice will be very well appreciated.
Sankar
The design repo is a metadata repo that stores all the design-time objects and so forth.
It is an architectural decision that you or your team need to decide on. There are many flexible ways to architect an OWB infrastructure.
Also, your repository users will be using the design repository on the other DB instance to do their design work... potentially fewer people hitting the target database all the time.
-Greg -
Hi experts,
I explain:
my environment has two nodes:
nodeA and nodeB (OS Linux 64-bit)
nodeA+nodeB host a database, DB1, which is an Oracle RAC across these two nodes.
The OMS is installed on this database.
I want to move the repository and OMS to a third node, separate from these two, and migrate the repository database to that third node.
I have checked note 382698.1,
How to move OMS repository from one node to another - Step-by-Step guide,
but my version is 10.2.0.4 (not 9i), and this note does not explain how to modify the file <AGENT_HOME>/sysman/config/emd.properties.
Can you help me, please?
Thank you for your time.
JRC
Once you move the repository, go to each agent and do this:
1) Stop agent and cleanup the status, etc...
cd <AGENT_HOME>
bin/emctl stop agent
rm -r ./sysman/emd/state/*
rm -r ./sysman/emd/upload/*
rm -r ./sysman/emd/collection/*
rm ./sysman/emd/lastupld.xml
rm ./sysman/emd/agntstmp.txt
rm ./sysman/emd/blackouts.xml
rm ./sysman/emd/protocol.ini
2) Then edit the $AH/sysman/config/emd.properties with your new data (host:port):
REPOSITORY_URL=https://newhost.domain.com:1159/em/upload
emdWalletSrcUrl=http://newhost.domain.com:4889/em/wallets/emd
3) And finally start the agent:
$AH/bin/emctl start agent
$AH/bin/emctl upload
PS: And be patient, it may take up to 30 minutes to re-sync with the new OMS.
-
Cross platform migrate repository and oms, gets "EM Key Verification Error"
Hello,
I am trying to migrate the repository and OMS cross-platform, from Sun Solaris SPARC 64-bit to Linux x86-64.
Environment:
Original repository: Oracle 10.2.0.3, OMS 10.2.0.5, Sun Solaris SPARC 64-bit
New repository: Oracle 10.2.0.3, OMS 10.2.0.5, Linux x86-64
Following metalink note:
Subject: How To Move Grid Control Oms & Repository To Another Server
Doc ID: 853293.1
to step 4:
4. Point the new OMS to the new repository
I can't get past the emkey corrupt issue; I get this error:
2009-10-13 14:07:24,570 [AJPRequestHandler-ApplicationServerThread-11] ERROR conn.ConnectionService verifyRepositoryEx.918 - EM Key Verification Error = Em key does not match with repos verifier
2009-10-13 14:07:27,910 [Shutdown] ERROR em.notification unregisterOMS.2573 - Error unregistering: EM Key is Missing java.sql.SQLException: EM Key is Missing
The newly installed additional OMS can only point to the original repository and can't point to the newly migrated repository.
Steps I did:
Install an additional OMS pointing to the original repository (making sure the emkey is in the repository)
Create a new empty database on the new host, with a new name (did the new name cause the problem?)
Migrate the old repository to the new one (using export/import)
Point the new OMS to the new repository by modifying emoms.properties
Start up the OMS and get the emkey corrupt error
This metalink note does not help either:
Subject: The Em Key is not configured properly. Run "emctl status emkey" for more details.
Doc ID: 817035.1
I also have an SR open, but still no help from Oracle Support. I did make sure the emkey was in the repository before the migration. There may be tricks that I don't know.
Can anyone shed some light on this?
Thank you for your help in advance.
Edited by: user1062137 on Oct 16, 2009 2:14 PM
Hi, I'm having the same issue.
Our original installation was on Windows 32-bit, with the repository and OMS on the same host. I want to move the repository to a Linux 64-bit host to make more resources available for everything. The new database is 11.1.0.7, though. As the Grid Control release is 10.2.0.5, Oracle Support have confirmed that the 11g database is OK to use.
In the log I see a lot of this:
[AJPRequestHandler-ApplicationServerThread-5] ERROR conn.ConnectionService verifyRepositoryEx.911 - Invalid Connection Pool. ERROR = ORA-01031: insufficient privileges
ORA-06512: at "SYSMAN.MGMT_USER", line 10296
ORA-06512: at "SYSMAN.SETEMUSERCONTEXT", line 17
ORA-06512: at line 1
2009-11-05 18:54:57,254 [AJPRequestHandler-ApplicationServerThread-7] ERROR conn.FGAConnection _setConnContext.326 - java.sql.SQLException: ORA-01031: insufficient privileges
ORA-06512: at "SYSMAN.MGMT_USER", line 10296
ORA-06512: at "SYSMAN.SETEMUSERCONTEXT", line 17
ORA-06512: at line 1
java.sql.SQLException: ORA-01031: insufficient privileges
ORA-06512: at "SYSMAN.MGMT_USER", line 10296
ORA-06512: at "SYSMAN.SETEMUSERCONTEXT", line 17
ORA-06512: at line 1
2009-11-05 18:54:59,973 [Shutdown] ERROR em.notification unregisterOMS.2573 - Error unregistering: EM Key is Missing
java.sql.SQLException: EM Key is Missing
Unfortunately that package is wrapped so I can't see what privs are required.
The instructions say (simplified here): either Data Pump or exp/imp to the new database, run some admin scripts, then update emoms.properties with the new connection information. That's it - start the OMS again and it should work.
Thankfully I can turn the old repository back on while I sift through these logs. -
Move EM10g repository to new server
I want to move my repository to a new (more powerful) server. I thought I could get away with an export, but that doesn't work. I end up with views that won't compile and other scary errors.
I could have done a move of the complete database (i.e. cloned the old one) to start with, but I wanted to reorganize the whole thing, so I really don't want the old stuff in my shiny new database. Is there a clean way to do this? Is there a migrate tool or something like that to export an EM repository, like there is for Designer repositories?
Hi,
One way is to run the emca(Enterprise Manager Configuration Assistant) utility.
Set the following environment variables to identify the Oracle home and the system identifier (SID) for the database you want to manage:
ORACLE_HOME
ORACLE_SID
Change directory to the ORACLE_HOME/bin directory.
Start EMCA by entering the following command with any of the optional command-line arguments shown in Table 1-3:
$PROMPT> ./emca
Depending upon the arguments you include on the EMCA command line, EMCA prompts you for the information required to configure Database Control.
For example, enter the following command to configure Database Control so it will perform automatic daily backups of your database:
$PROMPT> ./emca -config dbcontrol db -backup
EMCA commands are of the form:
emca [operation] [mode] [type] [flags] [parameters]
For info about the operation , type , flags , parameters have a look at Oracle Doc titled as "Enterprise Manager Advanced Configuration"
Regards,
Simon