google-cloud-pubsub questions


We are considering alternatives to PubSub due to high costs. For some of our low-value and high-volume data it can get quite expensive. The plan for using PubSub: Run the service in Kubernetes ...

I've got a function that creates topics and subscriptions when my server boots. The problem is that I can't create a subscription linked to a newly created topic; the Google PubSub server throws a NOT ...
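If the NOT_FOUND is transient (e.g. the subscription call races the topic creation), one common workaround is retrying the subscription call with exponential backoff. A minimal sketch of such a retry wrapper, assuming the failure really is transient; the wrapped call in the comment is illustrative:

```python
import time

def retry_with_backoff(op, attempts=5, base_delay=0.5):
    """Call `op`; on failure, retry with exponential backoff.

    Re-raises the last exception once all attempts are exhausted.
    """
    for attempt in range(attempts):
        try:
            return op()
        except Exception:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * 2 ** attempt)

# In a real service one might wrap the client call, e.g.:
# retry_with_backoff(lambda: subscriber.create_subscription(
#     name=subscription_path, topic=topic_path))
```

If the error persists after several retries, the topic path (project ID or topic ID) is more likely simply wrong.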

I'm working with an application that interacts with Google Cloud PubSub. It works fine in the normal scenario, but I want to enable proxy support, so I was going through Publisher.Builder and Subscriber ...

I am writing a simple Dataflow pipeline in Java: PubsubIO -> ConvertToTableRowDoFn -> BigQueryIO The pipeline is working -- data arrives in BigQuery as expected -- but I'm seeing OutOfMemoryErrors in ...

Why does "gcloud pubsub subscriptions pull" often report an empty message list while there are messages to be acknowledged in the subscription? See below the effect. The message 118870127432164 is ...

I'm using Google Cloud Platform to transfer data from an Azure server to a BigQuery table (working nice and smoothly, functionally speaking). The pipeline looks like this: Dataflow streaming pipeline ...

I sometimes get the following error when creating a subscription: Insufficient tokens for quota 'administrator' and limit 'CLIENT_PROJECT-100s' of service 'pubsub.googleapis.com' for consumer '...

I have a streaming Dataflow pipeline running to read a Pub/Sub subscription. After a period of time, or maybe after processing a certain amount of data, I want the pipeline to stop by itself. I don't want ...

How does one return a PubsubMessage through protoRPC? A Pub/Sub message example: { "data": string, "attributes": { string: string, ... }, "messageId": string, "publishTime": ...

When stopping a service using Pub/Sub and running on Google App Engine, the following stacktrace is received. System.ObjectDisposedException: Safe handle has been closed at System.Runtime....

I have a system that adds dynamically created instances with ephemeral IP addresses to GCP Cloud DNS when they start. However, I need to remove them from the DNS when they shut down, whether through ...

I have an IoT device that should be able to receive push notifications, but I don't want the notifications to be persisted in any way. The device is either currently online and receives the ...

From Android I want to download messages stored in the Google Cloud PubSub module; at the moment of performing the pull, the following runtime error is received: Android ...

I am looking for a procedure to create bulk topics and subscribers using a script or program. I have all the topic and subscriber creation commands in a script. How do I execute the script?
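One simple approach is to generate the gcloud commands from a list of name pairs and run them (or feed them to a shell). A minimal sketch; the project and topic/subscription names below are illustrative:

```python
def bulk_create_commands(project, pairs):
    """Build gcloud commands for (topic_id, subscription_id) pairs.

    Returns a list of shell command strings that could be executed
    with subprocess.run or written to a shell script.
    """
    cmds = []
    for topic, sub in pairs:
        cmds.append(f"gcloud pubsub topics create {topic} --project={project}")
        cmds.append(
            f"gcloud pubsub subscriptions create {sub} "
            f"--topic={topic} --project={project}"
        )
    return cmds

# Example (hypothetical names):
for cmd in bulk_create_commands("my-project", [("orders", "orders-sub")]):
    print(cmd)
```

For large batches, the client libraries avoid one process launch per command, but the generated-commands approach is the quickest way to reuse an existing script.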

Problem: My use case is that I want to receive messages from Google Cloud Pub/Sub, one message at a time, using the Python API. All the current examples mention using the async/callback option for pulling the ...
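The callback-free pattern amounts to a loop around a synchronous pull with a batch size of one. A minimal sketch of that loop with the actual pull call injected, so the control flow is clear without the client library; in the real API, `pull_one` would wrap a synchronous `subscriber.pull` with `max_messages=1` followed by an acknowledge:

```python
def process_one_at_a_time(pull_one, handle, max_empty=3):
    """Repeatedly pull a single message and handle it.

    `pull_one` stands in for a synchronous single-message pull; it
    returns a message or None on an empty response. We stop after
    `max_empty` consecutive empty pulls. Returns the processed count.
    """
    empty = 0
    processed = 0
    while empty < max_empty:
        msg = pull_one()
        if msg is None:
            empty += 1
            continue
        empty = 0
        handle(msg)
        processed += 1
    return processed
```

Note that empty responses are normal even with a backlog, so a production loop would keep polling rather than treat an empty pull as "done".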

I have developed a Go flex application to pull Pub/Sub messages asynchronously. I have deployed this code in Google Cloud App Engine. It's working fine after deployment. After some time (let's say, 12 hrs ...

I have defined a few different Cloud Dataflow jobs for Python in the Google AppEngine Flex Environment. I have defined my requirements in a requirements.txt file, included my setup.py file, and ...

It is possible for a service to do long polling on a pub/sub subscription. That obviously requires a TCP connection to be constantly open between the pub/sub service and the client. Is there any way ...

I found in the Google Pub/Sub documentation that a published message can wait for up to 7 days for the delivery to the subscriber and then it gets deleted. But is there a way to make this time shorter,...
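Retention is configurable per subscription (via the message retention duration setting), but only within the documented bounds of 10 minutes to 7 days. A small helper that clamps a requested duration to that range, as a sketch of the validation one might do before calling the API:

```python
MIN_RETENTION_S = 10 * 60            # Pub/Sub's documented minimum: 10 minutes
MAX_RETENTION_S = 7 * 24 * 60 * 60   # documented maximum: 7 days

def clamp_retention(seconds):
    """Clamp a requested retention duration (in seconds) to the
    range Pub/Sub accepts for a subscription."""
    return max(MIN_RETENTION_S, min(MAX_RETENTION_S, seconds))
```

The clamped value would then be passed as the subscription's message retention duration when creating or updating it.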

I've written an Apache Beam job using Scio with the purpose of generating session ids for incoming data records and then enriching them in some manner, before outputting them to BigQuery. Here's the ...

I am trying to publish multiple messages at a time (around 50), but Pub/Sub is giving a Deadline Exceeded at /user_code/node_modules/@google-cloud/pubsub/node_modules/grpc/src/client.js:55 error. const ...
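Bursts of concurrent publishes can exceed the client's RPC deadline; one mitigation is to chunk the messages and publish (and await) one batch at a time. The chunking itself is trivial and language-agnostic; a Python sketch with an illustrative batch size:

```python
def chunked(messages, size):
    """Split a list of messages into publish batches of at most `size`.

    Each batch would be published and awaited before sending the next,
    smoothing out the burst that triggers deadline errors.
    """
    return [messages[i:i + size] for i in range(0, len(messages), size)]
```

The client libraries also expose batching settings (max batch size, max latency) that achieve the same smoothing without hand-rolled chunking.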

Using Android Studio 3.0.1, I am trying to pull messages from the Google Cloud PubSub module, using the following code to retrieve the messages: PullMessage.java import android.app.Activity; import android.content....

We have a PubSub topic with events sinking into BigQuery (though the particular DB is almost irrelevant here). Events can come with new unknown properties that eventually should end up as separate ...

I'm writing a Java service that publishes to and subscribes from Google Pub/Sub. I want to focus on the subscribe part. Somehow it works for every other message. Not -ish, literally every other ...

In Google PubSub, the publish call from the client can be called asynchronously. Because of this, I would think that it would be possible to have multiple publish requests triggered and sent to the ...

I am wondering if there is a way to limit the rate at which Alpakka pulls from Google PubSub. Right now it pulls very intensively, which is fine when integrating with Google Cloud PubSub, but it is really ...

I'm getting started with Google Pubsub, and I created a simple subscriber with Python (3.6) to read messages from a queue on GCP Pubsub (v0.35.1). I followed the tutorial in the GCP documentation but something ...

Having an issue with identifying optimal settings for an on-premises GCP PubSub Publisher client. We start getting "Deadline Exceeded" within hours of starting to send data, with variations of "Stream removed" ...

Created a test topic on GC pubsub and can create pull subscriptions no problem but as soon as I try to create a URL endpoint subscription I get a "pubsub error INVALID_ARGUMENT" Params Subscription ...

I could not find the class ProjectSubscriptionName/MessageReceiver, and Eclipse reports a "ProjectSubscriptionName/MessageReceiver cannot be resolved" error, though the related google-cloud-pubsub jar ...

Is it possible to delete data from a BigQuery table while loading data into it from an Apache Beam pipeline? Our use case is such that we need to delete data from 3 days prior from the table on the basis ...

I am trying to ingest data from a 3rd party API into a Dataflow pipeline. Since the 3rd party doesn't make webhooks available, I wrote a custom script that constantly polls their endpoint for more ...

I am using Grafana to monitor my service. I have a new service that deals with Google Pub/Sub using spring-cloud-gcp-starter-pubsub, and I want to monitor the Pub/Sub subscription to show how many messages ...

The docs for PubSub state that the max payload after decoding is 10MB. My question is whether or not it is advantageous to compress the payload at the publisher before publishing to increase data ...
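Since the 10MB limit applies to the decoded payload, compressing before publishing does let more data through per message, at the cost of CPU on both ends. A sketch of a publisher-side helper that compresses only when it pays off and records the choice in a message attribute (the `encoding` attribute name and the threshold are our own convention, not anything Pub/Sub defines):

```python
import gzip

def maybe_compress(payload: bytes, threshold: int = 1024):
    """Gzip the payload if it is large enough to be worth it.

    Returns (data, attributes); the subscriber would check the
    'encoding' attribute and gzip.decompress when it says 'gzip'.
    """
    if len(payload) < threshold:
        return payload, {"encoding": "identity"}
    compressed = gzip.compress(payload)
    if len(compressed) >= len(payload):
        # Already-compressed or random data can grow under gzip.
        return payload, {"encoding": "identity"}
    return compressed, {"encoding": "gzip"}
```

Whether this is advantageous depends on how compressible the data is; JSON and text typically shrink several-fold, already-compressed media not at all.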

I want to use the Google-provided template for streaming data from Pubsub to Bigquery, but I want the streaming to be done at 1 row per second. How can I update the streaming speed? Any advice ...

How do I use Google Pub/Sub with Node.js's Socket.IO? The documentation isn't very good for Node.js. Can anyone provide an example?

I've been trying to get the Google Pub/Sub Java libraries to work using the Quickstart guides. None of them work as written, at least not for me. I'm working in IntelliJ, Maven framework, OSX, Java 8. ...

My task : I cannot speak openly about what the specifics of my task are, but here is an analogy : every two hours, I get a variable number of spoken audio files. Sometimes only 10, sometimes 800 or ...

I am trying to publish my CRM data to Google Cloud using the Pub/Sub service. After 30 to 40 minutes, the application I am using for Pub/Sub starts leaking memory. Can anyone help me solve this? Here ...

I want to store IoT event data in Google Cloud Storage, which will be used as my data lake. But doing a PUT call for every event is too costly, so I want to append events to a file, and then do a ...
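GCS objects are immutable, so "appending" usually means buffering events in the consumer and uploading one object per batch. A minimal sketch of such a buffer; the `sink` callable stands in for an actual GCS upload, and the thresholds are illustrative:

```python
class EventBuffer:
    """Buffer events and flush them as one newline-delimited object.

    `sink` stands in for a GCS upload (e.g. an upload-from-string call
    on a blob); we flush when either a count or a byte threshold is hit.
    """

    def __init__(self, sink, max_events=100, max_bytes=1 << 20):
        self.sink = sink
        self.max_events = max_events
        self.max_bytes = max_bytes
        self.events = []
        self.size = 0

    def add(self, event: bytes):
        self.events.append(event)
        self.size += len(event)
        if len(self.events) >= self.max_events or self.size >= self.max_bytes:
            self.flush()

    def flush(self):
        if self.events:
            self.sink(b"\n".join(self.events))
            self.events = []
            self.size = 0
```

A production version would also flush on a timer and on shutdown, so a quiet period or a crash does not strand buffered events.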

I have a Dataflow pipeline reading event data from a PubSub topic. When receiving a message, I do a transformation step to fit the event data to my desired BigQuery schema. However, if my created ...

I have verified access to my domain (http://ec2-54-67-124-251.us-west-1.compute.amazonaws.com:8080/) and registered it in APIs and Services. But I still get "pubsub error INVALID_ARGUMENT" when ...

In my scenario I'm scheduling tasks using PubSub. This is up to 2,000 PubSub messages that are then consumed by a Python script that runs inside a Docker container within Google Compute Engine. That ...

We are building a serverless platform consisting of Android and iOS apps built using React Native, and on the backend we use Google Cloud Functions and Firebase. Given that some actions are handled by ...

Is Pub/Sub significantly faster way of communicating between, say, Kubernetes Engine (GKE) api server and a Cloud Function (GCF)? Is it possible to use Pub/Sub to have such communication between GKE ...

I'm calling Pub/Sub via a REST request. I'm trying to put columnised data on a topic on Pub/Sub, which then goes into DataFlow, and finally into Big Query where a Table has been defined. This is the ...

Folks, I am getting the following error message using Pubsub: Exception in thread "main" com.google.cloud.pubsub.PubSubException: io.grpc.StatusRuntimeException: UNAVAILABLE: HTTP/2 error code: ...

My requirement is to batch-process/stream files through Pub/Sub into Google Cloud Storage using Python scripts. I have used the Python files below and am able to see the messages published from the topic to ...

I apologize in advance in asking this question. It must be something very silly that I am overlooking. I am a beginner to GCP. When I try to create a job using the GUI and google pubsub to bigquery ...

I have a Java client of the google-pubsub API. I sent 4 messages at the same time to Pub/Sub. My subscriber is triggered once an hour and loops on the Pub/Sub queue till it's empty. Nevertheless, I see the ...
