HRMS API 'hxc_tpd_end.retrieve_missing_time_periods' query
There are global tables such as g_time_periods. I would like to know whether we can invoke SRW.USER_EXIT('FND SRWINIT'); from a PL/SQL package in order to initialize and access this table.
hxc_tpd_end.retrieve_missing_time_periods extracts data when run from Reports, but it returns NULL when executed from a PL/SQL package.
Can you kindly throw some light on this?
I haven't used Matt Wright's scripts in this century, but I'm pretty sure that the Perl script still needs to go into your CGI bin, and the permissions (chmod) must be set to 755, or possibly 777, for the script to be readable, writable and executable.
Nancy O.
Alt-Web Design & Publishing
Web | Graphics | Print | Media Specialists
http://alt-web.com/
http://twitter.com/altweb
Similar Messages
-
Which HRMS API can be used to hire a Contact?
In 11i and/or R12.1.3, does anyone know which HRMS API(s) can be used to hire a Contact? That is, a "person" that exists in the system but is not an ex-employee, e.g., someone's spouse or child. This is a person that exists in PER_ALL_PEOPLE_F with a SYSTEM_PERSON_TYPE of OTHER. We are not using iRec or any applicant functionality, so I cannot create an application for the Contact and then hire the Applicant. Instead, my requirement is to hire the Contact directly.
I am using the following APIs for other scenarios, but haven't figured out the Contact hire yet:
HR_EMPLOYEE_API.CREATE_US_EMPLOYEE - using this API for brand new hires that do not currently exist in the database
HR_EMPLOYEE_API.RE_HIRE_EX_EMPLOYEE - using this API for rehiring ex-employees that already exist as a person in the database
One more note, I am able to accomplish this task with no problems from the front-end, so I expect there must be a way to do the same from the back-end.
e.g.,
1) Navigate to Person Form
2) Find Contact
3) Change Action to "Create Employment"
4) Choose Person Type "Employee"
5) Save, which successfully hires the Contact and creates the employee record
Please help!
Thanks,
Jason Genovese
Ahhh! I just took another look at Clive's suggestion to use HR_EMPLOYEE_API.HIRE_INTO_JOB. While the Oracle iRep only shows the one HIRE_INTO_JOB procedure, a look at the package in the database uncovers a second, overloaded HIRE_INTO_JOB procedure that should address my requirement. I believe this will work, but I'll test it out and post back with my results. Here are excerpts from the two signatures; the second (new) overloaded procedure should work:
-- |------------------------------< hire_into_job >---------------------------|
-- {Start Of Comments}
* This API hires an applicant as an employee.
* This API converts a person of type Applicant to a person of type Employee
* (EMP).
* <p><b>Prerequisites</b><br>
* The applicant must exist in the relevant business group and must have an
* applicant assignment with the assignment status Accepted. If person_type_id
* is supplied, it must have a corresponding system person type of EMP and must
* be active in the same business group as the applicant being changed to
* employee.
* <p><b>Post Success</b><br>
* The applicant has been successfully hired as an employee with a default
* employee assignment.
* <p><b>Post Failure</b><br>
* The applicant is not hired as an employee and an error is raised.
-- {End Of Comments}
PROCEDURE hire_into_job
(p_validate IN BOOLEAN DEFAULT FALSE
,p_effective_date IN DATE
,p_person_id IN NUMBER
,p_object_version_number IN OUT NOCOPY NUMBER
,p_employee_number IN OUT NOCOPY VARCHAR2
,p_datetrack_update_mode IN VARCHAR2 DEFAULT NULL
,p_person_type_id IN NUMBER DEFAULT NULL
,p_national_identifier IN VARCHAR2 DEFAULT NULL
,p_per_information7 IN VARCHAR2 DEFAULT NULL --3414274
,p_effective_start_date OUT NOCOPY DATE
,p_effective_end_date OUT NOCOPY DATE
,p_assign_payroll_warning OUT NOCOPY BOOLEAN
,p_orig_hire_warning OUT NOCOPY BOOLEAN
);
-- |----------------------------< hire_into_job - new >------------------------|
-- {Start Of Comments}
-- Description:
-- This business process converts a person of type EX_APL, EX_EMP or OTHER to a type of EMP.
-- This is achieved by:
-- o Setting the person type to EMP
-- o Creating a period of service
-- o Creating a default employee assignment
-- o Repopulating the security lists
-- Post Success:
-- The API updates the person and application and set the following out
-- parameters:
-- Name Type Description
-- p_per_object_version_number number If p_validate is false, set to
-- the new version number of the
-- person record. If p_validate is
-- true, set to the value passed in.
-- p_employee_number number If p_validate is false, set to the
-- employee number of the person. If
-- p_validate is true, set to the
-- value passed in.
-- p_assignment_id number If p_validate is false, set to the
-- assignment_id for the person.
-- p_effective_start_date date If p_validate is false, set to
-- the effective start date of the
-- updated person record. If
-- p_validate is true, set to null.
-- p_effective_end_date date If p_validate is false, set to
-- the effective end date of the
-- updated person record. If
-- p_validate is true, set to null.
-- p_assign_payroll_warning boolean Set to true if the person's date of
-- birth has not been set. Set to
-- false if the date of birth has been
-- entered. Indicates if it will be
-- possible to set the payroll
-- component on any of this person's
-- assignments.
-- p_orig_hire_warning boolean Set to true if the original date of
-- hire is not null and the person
-- type is not EMP, EMP_APL, EX_EMP or
-- EX_EMP_APL.
-- Post Failure:
-- The API does not update the person and period of service and raises an error.
-- Access Status:
-- Public.
-- {End Of Comments}
PROCEDURE hire_into_job
(p_validate IN BOOLEAN DEFAULT FALSE
,p_effective_date IN DATE
,p_person_id IN NUMBER
,p_object_version_number IN OUT NOCOPY NUMBER
,p_employee_number IN OUT NOCOPY VARCHAR2
,p_datetrack_update_mode IN VARCHAR2 DEFAULT NULL
,p_person_type_id IN NUMBER DEFAULT NULL
,p_national_identifier IN VARCHAR2 DEFAULT NULL
,p_per_information7 IN VARCHAR2 DEFAULT NULL --3414274
,p_assignment_id OUT NOCOPY NUMBER -- Bug#3919096
,p_effective_start_date OUT NOCOPY DATE
,p_effective_end_date OUT NOCOPY DATE
,p_assign_payroll_warning OUT NOCOPY BOOLEAN
,p_orig_hire_warning OUT NOCOPY BOOLEAN
);
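Based on the second signature above, a call from PL/SQL might look like the sketch below. The person_id and dates are illustrative assumptions, not values from this thread; verify the parameter list against the package specification in your own instance before using it:

```sql
DECLARE
  -- Hypothetical sample values; substitute the Contact's real person_id
  l_person_id          NUMBER := 1234;
  l_ovn                NUMBER;
  l_employee_number    VARCHAR2(30);
  l_assignment_id      NUMBER;
  l_effective_start    DATE;
  l_effective_end      DATE;
  l_payroll_warning    BOOLEAN;
  l_orig_hire_warning  BOOLEAN;
BEGIN
  -- Fetch the current object version number for the Contact
  SELECT object_version_number, employee_number
    INTO l_ovn, l_employee_number
    FROM per_all_people_f
   WHERE person_id = l_person_id
     AND TRUNC(SYSDATE) BETWEEN effective_start_date AND effective_end_date;

  -- Passing the extra p_assignment_id OUT parameter resolves the call
  -- to the second, overloaded procedure that accepts OTHER person types
  hr_employee_api.hire_into_job
    (p_validate                => FALSE
    ,p_effective_date          => TRUNC(SYSDATE)
    ,p_person_id               => l_person_id
    ,p_object_version_number   => l_ovn
    ,p_employee_number         => l_employee_number
    ,p_assignment_id           => l_assignment_id
    ,p_effective_start_date    => l_effective_start
    ,p_effective_end_date      => l_effective_end
    ,p_assign_payroll_warning  => l_payroll_warning
    ,p_orig_hire_warning       => l_orig_hire_warning
    );
  COMMIT;
END;
/
```
-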
Calling HRMS APIs from a .NET platform
Hello,
I am new on this forum and I am having difficulty getting started; I don't know if this is the right category for my question.
I'm working on a project looking for a solution for revising the self-service interfaces of Oracle E-Business Suite. I need an example or a document that can help me use the HRMS APIs of Oracle E-Business Suite from a .NET platform or a SharePoint application.
I have to work on a middleware layer for retrieving and persisting data from the Oracle E-Business Suite database using the Oracle HRMS APIs, such as hr_appraisals_api. So how can I call these APIs from a .NET application?
Can you help me please? I'm waiting for your answers, and this is my e-mail address if necessary: "[email protected]".
I would be so grateful if someone can help me. Thank you in advance.
Cordially.
The HRMS PL/SQL APIs can be called from Java, and .NET supports executing Java code, so it could be done that way.
I have no idea how to call a PL/SQL API directly from .NET code.
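For what it's worth, one common pattern for calling these APIs from .NET without a Java bridge is to compile a thin wrapper procedure in the database and invoke it from .NET as a stored procedure (e.g. via ODP.NET's OracleCommand with CommandType.StoredProcedure). The wrapper name and parameters below are hypothetical; the only EBS-specific call is fnd_global.apps_initialize, which sets the applications context the HRMS APIs expect:

```sql
CREATE OR REPLACE PROCEDURE xx_call_hrms_api
  (p_user_id      IN  NUMBER      -- FND user id (assumed inputs)
  ,p_resp_id      IN  NUMBER      -- responsibility id
  ,p_resp_appl_id IN  NUMBER      -- responsibility application id
  ,p_status       OUT VARCHAR2)
AS
BEGIN
  -- Establish the applications context before touching any HRMS API
  fnd_global.apps_initialize(p_user_id, p_resp_id, p_resp_appl_id);

  -- Call the HRMS API you need here, for example hr_appraisals_api;
  -- check the package specification for the exact parameter list.

  p_status := 'S';
EXCEPTION
  WHEN OTHERS THEN
    ROLLBACK;
    p_status := 'E: ' || SQLERRM;
END xx_call_hrms_api;
/
```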
Thanks -
I am trying to build a custom Form that inserts or modifies data in the Oracle HRMS system. To do this I thought the APIs would be the best tool, because they would preserve the integrity of the system as a last guard if all of my checks on the data failed to catch something.
I am having problems implementing this idea. I have what essentially amounts to wrapper functions that take the dataset I am dealing with and send it off through the various APIs needed to place it in the proper location. Insertion of new data compiles OK, but when I try to compile a package with a procedure in it that utilizes an update (HR_PERSON_API.UPDATE_US_PERSON), I get the following error:
Implementation Restriction: 'HR_API.G_NUMBER':
Cannot directly access remote package variable or cursor
To simplify the problem, I took the procedure out of the program unit in the form, made it stand-alone, and stripped it down to the bare minimum, passing in dummy data most of the time and using only local variables rather than objects. I end up with the following code:
procedure update_person is
start_date date;
end_date date;
full_name varchar2(300);
comment_id number;
comb_warning boolean;
ass_warning boolean;
pay_warning boolean;
emp_number varchar2(200);
object_version number;
begin
object_version := 10;
emp_number := 1210;
hr_person_api.update_us_person(
p_effective_date => to_date('02-apr-2006', 'dd-mon-yyyy'),
p_datetrack_update_mode => 'UPDATE',
p_person_id => 2852,
p_object_version_number => object_version,
p_employee_number => emp_number,
p_effective_start_date => start_date,
p_effective_end_date => end_date,
p_full_name => full_name,
p_comment_id => comment_id,
p_name_combination_warning => comb_warning,
p_assign_payroll_warning => ass_warning,
p_orig_hire_warning => pay_warning
);
end;
This gives me the error from above. If I create the same procedure in the database itself using a utility like TOAD, with the same login information, I get no such compile error, and I only get errors when I run the procedure, since all the values are dummy and don't actually map to anything.
Information on my version of forms builder, the db, and the application:
Forms [32 Bit] Version 6.0.8.27.0
RDBMS : 9.2.0.6.0
Oracle Applications : 11.5.8
Forms Server
Oracle Forms Version : 6.0.8.25.2
Application Object Library : 11.5.0
I have literally been on this same problem the entire day, and I would really appreciate anyone who can help me. At this point I am at a loss. I suspect either I am making the call to the API incorrectly, or it is simply not possible to do what I am trying to do. I hope I am just doing something wrong, because as I understood it, one of the points of the APIs was to allow custom forms to do validated insertion of data.
thanks for any help
Andrew
Your approach of using the published APIs is absolutely the best and supported mechanism to programmatically enter data into HRMS.
This is a long-known issue when calling APIs from Forms: you are not supplying values for all of the parameters that this API exposes, so the package defaults must be evaluated, but these fail over an RPC call.
To work around this, simply supply all the API parameters with a value to avoid defaulting. For those that you do not actually wish to set, you can set up local constants within the form, defined with the same definitions and values as hr_api.g_number, hr_api.g_varchar2, etc. (check the database package specification for these), and pass them explicitly in your call together with those that you are setting currently.
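One way to implement this workaround without hard-coding the default literals in the form is a small server-side helper package, compiled in the database where hr_api IS directly visible, that exposes the defaults as functions. The form then fetches them once (e.g. in WHEN-NEW-FORM-INSTANCE) and passes them explicitly for every parameter it does not wish to change. The xx_ names below are hypothetical:

```sql
CREATE OR REPLACE PACKAGE xx_hr_api_defaults AS
  FUNCTION def_number   RETURN NUMBER;
  FUNCTION def_varchar2 RETURN VARCHAR2;
  FUNCTION def_date     RETURN DATE;
END xx_hr_api_defaults;
/
CREATE OR REPLACE PACKAGE BODY xx_hr_api_defaults AS
  -- These run on the database server, so the "cannot directly access
  -- remote package variable" restriction does not apply here
  FUNCTION def_number   RETURN NUMBER   IS BEGIN RETURN hr_api.g_number;   END;
  FUNCTION def_varchar2 RETURN VARCHAR2 IS BEGIN RETURN hr_api.g_varchar2; END;
  FUNCTION def_date     RETURN DATE     IS BEGIN RETURN hr_api.g_date;     END;
END xx_hr_api_defaults;
/
```

Stored functions are callable from Forms over RPC, so the form can store the returned values in local variables and supply them in the hr_person_api.update_us_person call for all "leave unchanged" parameters.
-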
Still getting uncaught exception in c++ API running keywords query
When I run a search based on a keyword in the Java application, the query results are most likely returned the first time, but for subsequent keyword searches the application throws the error below...
com.sleepycat.dbxml.XmlException: Uncaught exception from C++ API, errcode = INTERNAL_ERROR
at com.sleepycat.dbxml.dbxml_javaJNI.XmlQueryExpression_execute__SWIG_1(Native Method)
at com.sleepycat.dbxml.XmlQueryExpression.execute(XmlQueryExpression.java:85)
at epss.utilities.XQueryUtil.getQueryResultsByKeywords(XQueryUtil.java:168)
at epss.search.XmlContentByKeywords.getDocumentContentByKeywords(XmlContentByKeywords.java:123)
at com.epss.test.TestApp.main(TestApp.java:83)
I know one of the many things to consider in fixing this problem is to make sure the delete() method is called on all Berkeley DB XML objects (e.g. XmlContainer, XmlManager, XmlResults, XmlQueryExpression, etc.) once they are done, to free resources. I've been doing all that and am still getting the error. This problem doesn't happen when I run a search based on an id (attribute value).
Note: I'm not explicitly using a transaction, since I turned on transactions in EnvironmentConfig to create the XmlManager.
This is the method that does the query and returns the results...
/**
 * Gets the query results by keywords.
 *
 * @param keywords
 *            the keywords under search
 * @param manager
 *            the object used to perform activities such as preparing XQuery
 *            queries
 * @return the query results by keywords
 */
public static synchronized XmlResults getQueryResultsByKeywords(
        final String keywords, XmlManager manager) {
    /* Represents a parsed XQuery expression. */
    XmlQueryExpression expr = null;
    /* Encapsulates the results of a query that has been executed. */
    XmlResults results = null;
    /* The query context */
    XmlQueryContext context = null;
    // The value
    XmlValue value = null;
    // Declare string variables
    String query = null;
    // Run logic
    try {
        /* Do null check */
        if (manager != null) {
            // Make XmlValue object
            value = new XmlValue(keywords);
            // Get a query context
            context = manager.createQueryContext();
            // Bind xquery variable value to its variable name
            context.setVariableValue(DataConstants.KEYWORD, value);
            // Build the query string
            query = QueryStringUtil.xQueryStringByKeywords(
                    DataConstants.ELEMENTS, DataConstants.KEYWORD);
            // Compile an XQuery expression into an XmlQueryExpression
            expr = manager.prepare(query, context);
            // Evaluate the XQuery expression against the containers
            results = expr.execute(context);
            /* Release resources when nothing was found */
            if (results.size() == 0) {
                results.delete();
                results = null;
                // Free the native resources
                expr.delete();
                // Dereference objects
                expr = null;
                value = null;
                context = null;
                query = null;
                manager.delete();
                manager = null;
            }
        }
        return results;
    } catch (final XmlException e) {
        // Free the native resources
        if (expr != null) {
            expr.delete();
        }
        // Write to log
        WriteLog.logExceptionToFile(e);
        return null;
    }
}
This is the callback method that returns the query string...
/**
 * Returns the keyword query string used to retrieve keywords.
 *
 * @param elementName The particular node under search
 * @param keywords The keywords being searched under the node
 * @return The string used for the query
 */
public static synchronized String xQueryStringByKeywords(
        final String elementName, final String keywords) {
    /* Build query string */
    final StringBuffer sb = new StringBuffer();
    sb.append("let $found := false\n");
    sb.append("let $terms := tokenize($");
    sb.append(keywords);
    sb.append(", \",\")\n");
    sb.append("for $element in collection('");
    sb.append(DataConstants.CONTAINER);
    sb.append("')");
    sb.append("/(FUNDOC | JOBDOC)");
    sb.append("//");
    sb.append(elementName);
    sb.append("//");
    sb.append("parent::*[1]");
    sb.append("\nlet $found := for $term in $terms\n");
    sb.append(" return if (contains(lower-case($element), lower-case($term)))");
    sb.append(" \nthen \"true\"");
    sb.append(" else \"false\" \n");
    sb.append(" return if ($found = \"false\") \nthen () else $element");
    return sb.toString();
}
Edited by: user3453165 on Jan 20, 2010 7:20 AM
I am using Berkeley DB XML 2.5.13 on Windows XP. Yes, that's the complete error message. I am going to add my environment class and also part of the keyword search class that extends the environment, which will give you an idea of how I'm creating and using transactions. I don't explicitly use a transaction. I used to use one explicitly, but I thought it was redundant. So when I create the db environment, I just call envc.setTransactional(true) and pass the EnvironmentConfig object (i.e. envc) to the environment to create an instance of XmlManager, and this is fine. Look below and you will see what I mean. Please let me know if you need more information. Thanks for your help. Appreciate it.
Tue, 2010-01-19 10:58:27 PM
com.sleepycat.dbxml.XmlException: Uncaught exception from C++ API, errcode = INTERNAL_ERROR
at com.sleepycat.dbxml.dbxml_javaJNI.XmlQueryExpression_execute__SWIG_1(Native Method)
at com.sleepycat.dbxml.XmlQueryExpression.execute(XmlQueryExpression.java:85)
at epss.utilities.XQueryUtil.getQueryResultsByKeywords(XQueryUtil.java:166)
at epss.search.XmlContentByKeywords.getDocumentContentByKeywords(XmlContentByKeywords.java:123)
at com.epss.test.TestApp.main(TestApp.java:66)
The environment class...
package epss.core;
import java.io.File;
import java.io.FilenameFilter;
import java.io.IOException;
import com.sleepycat.db.DatabaseException;
import com.sleepycat.db.Environment;
import com.sleepycat.db.EnvironmentConfig;
import com.sleepycat.dbxml.XmlContainer;
import com.sleepycat.dbxml.XmlContainerConfig;
import com.sleepycat.dbxml.XmlManager;
import com.sleepycat.dbxml.XmlManagerConfig;
import epss.utilities.GlobalUtil;
* Class used to open and close Berkeley Database environment.
public class DatabaseEnvironment {
/** The db env_. */
private Environment dbEnv_ = null;
/** The mgr_. */
private XmlManager mgr_ = null;
/** The opened container. */
private XmlContainer openedContainer = null;
/** The new container. */
private XmlContainer newContainer = null;
/** The path2 db env_. */
private File path2DbEnv_ = null;
/** Whether we are creating or opening database environment. */
private int mode = -1;
/** Constants for mode opening or mode creation. */
private static final int OPEN_DB = 0, CREATE_DB = 1;
* Set the Mode (CREATE_DB = 1, OPEN_DB = 0).
* @param m
* the m
protected synchronized void setDatabaseMode(final int m) {
if (m == OPEN_DB || m == CREATE_DB)
mode = m;
* Gets the manager.
* @return the manager
protected synchronized XmlManager getManager() {
return mgr_;
* Gets the opened container.
* @return the opened container
protected synchronized XmlContainer getOpenedContainer() {
return openedContainer;
* Gets the new container.
* @return the new container
protected synchronized XmlContainer getNewContainer() {
return newContainer;
* Initialize database environment.
* @throws Exception
* the exception
protected synchronized void doDatabaseSetup(String container)
throws Exception {
switch (mode) {
case OPEN_DB:
// check database home dir exist
if (!(isPathToDbExist(new File(DataConstants.DB_HOME)))) {
WriteLog.logMessagesToFile(DataConstants.DB_FILE_MISSING);
cleanup();
throw new IOException(DataConstants.DB_FILE_MISSING);
} else {
// Configure database environment
configureDatabaseEnv();
// Configuration settings for an XmlContainer instance
XmlContainerConfig config = new XmlContainerConfig();
// DB shd open within a transaction
config.setTransactional(true);
// Opens a container, returning a handle to an XmlContainer obj
openedContainer = getManager().openContainer(container, config);
break;
case CREATE_DB:
// Set environment home
setDatabaseHome();
// Validate database home dir exist
if (isPathToDbExist(new File(DataConstants.DB_HOME))) {
// Configure database environment
configureDatabaseEnv();
// Configuration settings for an XmlContainer instance
XmlContainerConfig config = new XmlContainerConfig();
// Sets whether documents are validated
config.setAllowValidation(true);
// DB shd open within a transaction
config.setTransactional(true);
// The database container path
File file = new File(path2DbEnv_, container);
// Creates a container, returning a handle to
// an XmlContainer object
newContainer = getManager().createContainer(file.getPath(),
config);
newContainer.setAutoIndexing(true);
break;
default:
throw new IllegalStateException("mode value (" + mode
+ ") is invalid");
* Validate path2 db env.
* @param path2DbEnv
* the path2 db env
* @return true, if checks if is path to db env
private synchronized boolean isPathToDbExist(final File path2DbEnv) {
boolean returnValue = false;
if (!(path2DbEnv.isDirectory() || path2DbEnv.exists())) {
throw new IllegalArgumentException(DataConstants.DIR_ERROR
+ path2DbEnv.getAbsolutePath()
+ DataConstants.DOES_NOT_EXIST);
} else {
path2DbEnv_ = path2DbEnv;
// Test whether db home exist when mode is 0
if (path2DbEnv_.exists() && mode == OPEN_DB) {
// Test whether all db files exist
returnValue = true;
} else {
// Test whether db home exist when mode is 1
if (path2DbEnv_.exists() && mode == CREATE_DB) {
returnValue = true;
return returnValue;
* Set database environment home.
* @throws IOException
* Signals that an I/O exception has occurred.
private synchronized void setDatabaseHome() throws IOException {
// The base dir
File homeDir = new File(DataConstants.DB_HOME);
// If db home delete fails, throw io exception
if (!GlobalUtil.deleteDir(homeDir) && homeDir.exists()) {
WriteLog.logMessagesToFile(DataConstants.ERROR_MSG);
throw new IOException(DataConstants.ERROR_MSG);
} else {
// If delete is successful, recreate db home
final boolean success = homeDir.mkdir();
// if home dir creation is successful
if (success) {
// Construct file object
File logDir = new File(homeDir, DataConstants.LOG_DIR);
// File dbHome = new File(homeDir, DataConstants.DB_DIR);
// Create log file
boolean logCreated = logDir.mkdir();
// Create db home
// boolean dbHomeCreated = dbHome.mkdir();
if (logCreated) {
WriteLog.logMessagesToFile(homeDir.getAbsolutePath()
+ " successfully created");
} else {
WriteLog.logMessagesToFile(homeDir.getAbsolutePath()
+ " failed to create");
* Sets the environment configuration and its handlers.
* @throws Exception
* the exception
private synchronized void configureDatabaseEnv() throws Exception {
// Construct a new log file object
File logDir = new File(path2DbEnv_, DataConstants.LOG_DIR);
// The environment config
EnvironmentConfig envc = new EnvironmentConfig();
// estimate how much space to allocate
// for various lock-table data structures
envc.setMaxLockers(10000);
// estimate how much space to allocate
// for various lock-table data structures
envc.setMaxLocks(10000);
// estimate how much space to allocate
// for various lock-table data structures
envc.setMaxLockObjects(10000);
// automatically remove log files
// that are no longer needed.
envc.setLogAutoRemove(true);
// If environment does not exist create it
envc.setAllowCreate(true);
// For multiple threads or processes that are concurrently reading and
// writing to berkeley db xml
envc.setInitializeLocking(true);
// This is used for database recovery from application or system
// failures.
envc.setInitializeLogging(true);
// Provides an in-memory cache that can be shared by all threads and
// processes
envc.setInitializeCache(true);
// Provides atomicity for multiple database access operations.
envc.setTransactional(true);
// location of logging files.
envc.setLogDirectory(logDir);
// set the size of the shared memory buffer pool
envc.setCacheSize(500 * 1024 * 1024);
// turn on the mutexes
envc.setMaxMutexes(500000);
// show error messages by BDB XML library
envc.setErrorStream(System.err);
// File db_home = new File(path2DbEnv_, "db");
// Create a database environment
dbEnv_ = new Environment(path2DbEnv_, envc);
// Configure an XmlManager instance via its constructors
XmlManagerConfig mgrConf = new XmlManagerConfig();
mgrConf.setAllowExternalAccess(true);
mgrConf.setAllowAutoOpen(true);
// Create xml manager object
mgr_ = new XmlManager(dbEnv_, mgrConf);
mgr_.setDefaultContainerType(XmlContainer.NodeContainer);
* This method is used to close the database environment, freeing any
* allocated resources that may have been held by its handlers and closing
* any underlying subsystems.
* @throws DatabaseException
* the database exception
protected synchronized void cleanup() throws DatabaseException {
if (path2DbEnv_ != null) {
path2DbEnv_ = null;
if (newContainer != null) {
newContainer.delete();
newContainer = null;
if (openedContainer != null) {
openedContainer.delete();
openedContainer = null;
if (mgr_ != null) {
mgr_.delete();
mgr_ = null;
if (dbEnv_ != null) {
dbEnv_.close();
dbEnv_ = null;
// This is the keyword search class...
public final class XmlContentByKeywords extends DatabaseEnvironment {
public synchronized Document getDocumentContentByKeywords(String keywords)
throws Exception {
// Encapsulates the results of a query that has been executed.
XmlResults results = null;
// The manager
XmlManager manager = null;
// Run the logic
if (keywords != null) {
try {
// Flag to open db
final int OPEN_DB = 0;
// The keywords content
Document keywordsContent = null;
// Open db connection
try {
// Get database instance
setDatabaseMode(OPEN_DB);
// Open this container in db environment
doDatabaseSetup(DataConstants.CONTAINER);
} catch (Exception ex) {
// Create error node with error message
keywordsContent = Wrapper.createErrorDocument(ex
.getMessage());
// Return the error node doc
return keywordsContent;
// Manager instance
// final XmlManager manager = getManager();
manager = getManager();
// Transaction instance
// final XmlTransaction txn_ = getTxn();
// The map
Map<String, Document> map = null;
// The temp map
Map<String, Document> tempMap = null;
// Return the query results
results = XQueryUtil.getQueryResultsByKeywords(keywords, manager);
// use results here...
// close results when done
results.delete();
results = null;
manager.delete();
manager = null;
} -
API method to query multicast address and port of a cluster
I am new to Tangosol and I'm looking to write a quick Java dashboard that will display current information on our cluster, such as the cluster name and the multicast address and port being used. Looking at the API, I've found the Cluster object and I'm able to get the member set and the cluster name, but I have not been able to find a way to query the multicast address and port being used by the cluster.
Any help or links would be appreciated.
Thanks,
Len
Hi Len,
All this information can be retrieved via JMX. Please see this Wiki page. Detailed documentation about the semantics of all exposed attributes and methods can be found in the Javadoc for the Registry interface (http://tinyurl.com/r75sy).
Regards,
Gene -
Connect to Linkedin API with Power Query
Hello
Is there a way to create Power Query queries that connect to the LinkedIn API permanently (like Power Query does with Facebook)? I know it is possible to access the LinkedIn API the way Shish Shridar did it, but it is pretty limited and thus frustrating (see his article entitled "Analyzing LinkedIn Data using PowerBI" on his blog).
I am sorry to ask without more technical details, but I am pretty new to Power Query. I guess it has something to do with OAuth2 authentication not being implemented in Power Query...
I would be delighted if someone would be kind enough to provide me some insight on this issue!
Thanks for your answer.
What if I create an app to get the required access token etc.? I know an Excel add-in can be registered as a Twitter app and thus be able to connect to the website's API (I'm thinking of Analytics for Twitter 2013, for instance) - is there any way to do the same with LinkedIn?
I guess this is far beyond my capacities for now, but any insight would be very much appreciated !
[EDIT]
I did a little more research... I created a LinkedIn app and then followed the steps described in the official documentation to enable it to make authenticated API calls to LinkedIn using OAuth 2.0 (I cannot use hyperlinks for now, but here is the full link to the official doc: https://developer.linkedin.com/docs/oauth2).
Maybe some VBA would be able to request an authorization code using this type of URL: https://www.linkedin.com/uas/oauth2/authorization?response_type=code&client_id=MYCLEINTID&state=STATE&redirect_uri=MYREDIRECTURL
- The user will then be presented with LinkedIn's authentication dialog box. Is VBA able to fill in this login form?
- If it is, then it should get the code displayed in the redirection URL, which looks like: MYREDIRECTURL?code=THECODETOGETWITHVBA&state=STATE
- If VBA can do that, then it just has to go to this new URL: https://www.linkedin.com/uas/oauth2/accessToken?grant_type=authorization_code&code=THECODETOGETWITHVBA&redirect_uri=MYREDIRECTURL&client_id=MYCLIENTID&state=STATE&client_secret=MYCLIENTSECRET
- At this point, the last URL returns the access token, which could then be stored somewhere in Excel and thus used in Power Query (pretty easy to do using a header like this: Headers=[#"Authorization"="Bearer AccessToken"])
Hope someone will see this and tell me if it is feasible and likely to succeed. -
API to access query structure / bad performance Bex query processor
Hi, we are using a big P&L query structure. Each query structure node selects a hierarchy node of the account.
This setup makes the performance incredibly bad. The BEx query processor caches and selects per structure node, which creates an awful mass of unnecessary SQL statements. (It would be more useful to merge the SQL statements as far as possible, with a GROUP BY on account, to generate bigger SQL statements.)
The structure is necessary to cover percentage calculations in the query; the hierarchy is used to "calculate" subtotals by selecting different nodes on different levels.
I am now searching for a different approach to cover the reporting requirement, or for an API to generate smaller query structures per area of the P&L out of the master structure. Is there any class to access the query structure?
We already tried generating data entries per node level (duplicating one data record per node where it appears, with a characteristic for the node name), but this approach generates too many data records.
Not using hierarchy nodes would make the maintenance terrible. An API to change the structure would also be useful for generating "hard" selections in the structure out of the hierarchy.
The problem came from incorrect development of an exit variable used in Analysis Authorization.
Edited by: SSE-BW-Team SSE-BW-Team on Feb 28, 2011 1:46 PM -
Hi Everyone,
I need help in writing a PL/SQL script (API) to move a thousand or more employees from an old business group to a new business group.
I am having a challenge doing this, as it does not pick individual employees.
What can I include in my API to pick new and existing employees and move them from one business group to another?
Kindly assist.
Thank you.
I believe HR has embedded functionality - called Mass Move - that should be able to achieve this. You will probably need to look at the HR documentation to see how this works.
MOS Doc 166363.1 - How to Setup Mass Move
HTH
Srini -
Query/Read Interface for API
Is there a query/read interface in the HRMS APIs or the TCA APIs?
We are using Customers Online to maintain data, but I want a custom application to view the data.
The obvious choice would be through the TCA APIs, but I cannot find any query/read interfaces there.
Oracle's docs seem to say it's possible, so could somebody point me in the right direction please?
cheers
Dominic
Does anyone have any idea about this?
-
Detailed Documentation on APIs in Oracle HRMS
Hi,
I'm interfacing data into Oracle HRMS. Once the data from the flat files is put into the intermediate tables, the API and Data Pump need to be run.
Could anybody tell me where I could find detailed documentation about
1)the HRMS API's,
2)the Data pump and
3)which API handles which table's data?
please put the link in this post or mail it into this account:
[email protected]
Thanks in Advance
KP
You can refer to the Metalink site and use the eTRM site for the best information on the APIs.
giridharan d
[email protected] -
Dear All,
We are using Oracle EBS R12.1.1 and we have users on HRMS. I have a query to retrieve the following information from the HRMS tables in CSV format:
Employee Number, First Name, Middle Name, Last Name, Job Title, mobile, office phone no, extension, Office Location (which project), department, Email, Address, City, Country.
Note: Actually I want to sync users information from HRMS to active directory
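A starting point for such an extract might be a query like the sketch below. The table and column names are the standard HRMS ones, but this is an assumption-laden sketch (phones from per_phones are omitted, and your instance may store job title and department differently), so verify every column against eTRM before relying on it:

```sql
SELECT papf.employee_number
      ,papf.first_name
      ,papf.middle_names
      ,papf.last_name
      ,pj.name                AS job_title
      ,haou.name              AS department
      ,hla.location_code      AS office_location
      ,papf.email_address
      ,pa.address_line1       AS address
      ,pa.town_or_city        AS city
      ,pa.country
  FROM per_all_people_f          papf
      ,per_all_assignments_f     paaf
      ,per_jobs                  pj
      ,hr_all_organization_units haou
      ,hr_locations_all          hla
      ,per_addresses             pa
 WHERE paaf.person_id            = papf.person_id
   AND paaf.primary_flag         = 'Y'
   AND paaf.assignment_type      = 'E'          -- employees only
   AND pj.job_id   (+)           = paaf.job_id
   AND haou.organization_id (+)  = paaf.organization_id
   AND hla.location_id (+)       = paaf.location_id
   AND pa.person_id (+)          = papf.person_id
   AND pa.primary_flag (+)       = 'Y'
   -- date-track the person and the assignment to "today"
   AND TRUNC(SYSDATE) BETWEEN papf.effective_start_date AND papf.effective_end_date
   AND TRUNC(SYSDATE) BETWEEN paaf.effective_start_date AND paaf.effective_end_date;
```

Because the assignment row is date-tracked, re-running this extract after a project-to-project transfer picks up the new organization and location automatically, which is what keeps Active Directory in sync.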
The challenge is that when an employee changes his location (transfers from project to project), his transfer information is always correct in Oracle EBS (Payroll) but not in Active Directory.
Hi Mohsin,
What are you looking for here ?
I cannot understand what exactly you're expecting from the group.
Cheers,
Vignesh -
Exporting a report in PDF and CSV formats at the same time without executing the DB query twice
Post Author: cpriyanka
CA Forum: Exporting
I am using Crystal Reports 9.0 and Java.
I am getting "PrintOutputController" from "ReportClientDocument"
Then, by calling export(ReportExportFormat.PDF), I generate the report in PDF format.
Now I need to generate both the PDF and CSV files at the same time. How can this be done?
My understanding is that when we call "export", it executes the DB query along with the rest of the export processing.
In that case, if I call "export" twice with two different formats, the DB query will be executed twice, and that takes a lot of time.
To avoid this, is there a way I can call some API so that the DB query is executed once, but the report can be exported into multiple formats?
I appreciate your help.
Thanks.
Post Author: cjmorris1201
CA Forum: Exporting
Hello,
Are you using the "pull" or "push" method for your crystal reports? If you are using the "pull" method (the report itself executes the sql) then I believe there is no way around having the query execute twice since it is fired off each time you open and export the report.
If you use the "push" method, however, then you can create the recordset/dataset once and set it as the data source for the report.
Here's a broad overview of push and pull; it demonstrates the Crystal Report Viewer, which may or may not be applicable in your case:
http://aspalliance.com/265_Crystal_Report_for_Visual_Studio_NET#Page5
Regards, Carl -
How to filter the REST API data based on Taxonomy columns
Hi Everyone,
We are using SharePoint2010 Standard Edition.
I want to get the library details through the REST API. I am using the following:
https://SiteUrl/_vti_bin/listdata.svc/Documents?$filter=Title eq 'SharePointDoc'
Here I am able to get the info regarding "SharePointDoc". But when I try to filter on a taxonomy field, it doesn't work.
Can anyone please tell me how we can filter based on taxonomy fields?
Thanks in Advance
Krishnasandeep
Hi,
I understand that you want to filter the REST API data based on taxonomy columns.
To my knowledge, in SharePoint 2010 not all column types are available via REST; most annoyingly, managed metadata columns are among this group of unsupported column types.
However, in SharePoint 2013, we can filter list items based on taxonomy (managed metadata) columns.
Taxonomy fields can now be queried via the REST API by embedding a CAML query in the REST call.
Here is a great blog for your reference:
http://www.cleverworkarounds.com/2013/09/23/how-to-filter-on-a-managed-metadata-column-via-rest-in-sharepoint-2013/comment-page-1/
You may need to adapt the REST calls and the CAML query to check whether this works in SharePoint 2010.
More information:
http://platinumdogs.me/2013/03/14/sharepoint-adventures-with-the-rest-api-part-1/
Best Regards,
Linda Li
TechNet Community Support -
Please help me to find the problem in this Query
hi
This is my xml structure,
<?xml version="1.0"?>
<rss version ="2.0">
<channel>
<Lgs>
<Lg>
<Fid>16447</Fid>
<Fname>Sam</Fname>
</Lg>
</Lgs>
<Lgs>
<Lg>
<Fid>206</Fid>
<Fname>David</Fname>
</Lg>
</Lgs>
</channel>
</rss>
I need to limit the results, i.e., retrieve only 10 results at a time, ordered by the node "Fid".
I am using PHP-API
and the following query is what I have used:
$query = "let $hits := (for $hit in collection('gview.dbxml')/rss/channel/Lgs/Lg order by $hit/Fid return $hit) return subsequence($hits, 1, 10)";
$results = $mgr->query($query);
getting the error,
Fatal error: Uncaught exception 'xmlexception' with message 'Error: syntax error, unexpected :=, expecting <end of file> [err:XPST0003], <query>:1:8' in /usr/local/apache/htdocs/add_loc_file.php:99 Stack trace: #0 /usr/local/apache/htdocs/add_loc_file.php(99): xmlmanager->query('let := (for i...') #1 {main} thrown
Message was edited by:
user647571
The "$hits" in your string is being interpreted by PHP as a variable reference and substituted out before DB XML gets the query (note the `let := (for i...` in the error message). Put the query in single quotes, or escape the dollar signs as `\$`, so that DB XML receives the variable names intact.
John