Compare commits


101 Commits
7.1 ... 5.2

Author SHA1 Message Date
Alex Vandiver
b20797ed9c Release Zulip Server 5.2. 2022-05-04 00:27:55 +00:00
Alex Vandiver
e637ff626d version: Fix latest major version and blog post link. 2022-05-04 00:27:08 +00:00
Alex Vandiver
ca4cf94e79 settings: Remove misleading and irrelevant comment.
This comment was _originally_ for the `default` memcached cache, back
when it was added all of the way back in 0a84d7ac62.  9e64750083
made it a lie, and edc718951c made it even more confusing when it
removed the `default` cache configuration block, leaving the wrong
comment next to the wrong cache configuration block.

Banish the comment.

(cherry picked from commit 7cc9b93b91)
2022-05-03 16:12:08 -07:00
Alex Vandiver
789e960672 test_link_embed: Remove unnecessary TEST_CACHES.
The only purpose of this seems to be to not have to reset the cache;
fae59502ab added it without any explanation for why it is necessary.

Remove it, and explicitly flush the cache in the one place where it is
necessary.

(cherry picked from commit 9030d53acb)
2022-05-03 16:12:08 -07:00
Alex Vandiver
572138d983 caches: Remove unnecessary "in-memory" cache.
This cache was added in da33b72848 to serve as a replacement for the
durable database cache, in development; the previous commit has
switched that to be the non-durable memcached backend.

The special-case for "in-memory" in development is mostly-unnecessary
in contrast to memcached -- `./tools/run-dev.py` flushes memcached on
every startup.  This differs in behaviour slightly, in that if the
codepath is changed and `run-dev` restarts Django, the cache is not
cleared.  This seems an unlikely occurrence, however, and the code
cleanup from its removal is worth it.

(cherry picked from commit 56058f3316)
2022-05-03 16:12:08 -07:00
Alex Vandiver
df8ac69d90 caches: Cache link preview data in memcached, not in PostgreSQL.
The choice to cache these in the database dates back to c93f1d4eda,
with the comment added in da33b72848 while working around the
durability of the "database" cache in local development.

The values were stored in a durable cache, as they needed to be
ensured to persist between when they were inserted in
`get_link_embed_data` and when they were used in
`render_incoming_message` via `link_embed_data_from_cache`.

However, database accesses are not fast compared to memcached, and we
wish to avoid the overhead of the database connection from the
`embed_links` worker.  Specifically, making the connection may not be
thread-safe -- and in low-memory (and Docker) configurations, all
workers run as separate threads in a single process.  This can lead to
stalled database connections in `embed_links` workers, and failed
previews.

Since the previous commit made the durability of the cache no longer
necessary, this will have minimal effect; at worst, posting the same
URL twice, on either side of an upgrade, will result in two preview
fetches of it.

(cherry picked from commit 04ca2e92f7)
2022-05-03 16:12:08 -07:00
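
A minimal sketch of the non-durable caching this commit moves to, using Django's cache API; the key prefix and timeout below are illustrative assumptions, not Zulip's actual values:

```
from typing import Optional

from django.core.cache import cache  # the "default" backend (memcached in production)

def cache_embed_data(url: str, data: dict) -> None:
    # Store fetched preview data in the non-durable cache.
    cache.set("link_embed:" + url, data, timeout=3600)

def cached_embed_data(url: str) -> Optional[dict]:
    # A miss is harmless: at worst the same URL is fetched twice,
    # e.g. on either side of an upgrade, as described above.
    return cache.get("link_embed:" + url)
```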
Alex Vandiver
9a9c6730ff preview: Use cache only as a non-durable cache, not an IPC.
The `get_link_embed_data` / `link_embed_data_from_cache` pair as
introduced in c93f1d4eda uses the cache
as a temporary store inside of the `embed_links` worker; this means
that it must be durable storage, or the worker will stall and re-fetch
the same links to preview them.

Switch to plumbing through the fetched URL embed data as a parameter
to the Markdown evaluation which uses them, rather than using the
cache as an intermediary.  This frees up the cache to be merely a
non-durable cache.

As a side-effect, this removes get_cache_with_key and
link_embed_data_from_cache, which was its only callsite.

(cherry picked from commit 351bdfaf78)
2022-05-03 16:12:04 -07:00
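
A rough sketch of the resulting shape, with made-up function names; the real code lives in the `embed_links` worker and the Markdown pipeline:

```
from typing import Dict, List, Optional

def fetch_embed_data(url: str) -> Optional[dict]:
    # Stand-in for the real oEmbed / Open Graph fetch.
    return {"title": url}

def render_with_embeds(content: str, url_embed_data: Dict[str, Optional[dict]]) -> str:
    # Markdown rendering receives the fetched data directly as a parameter,
    # instead of reading it back out of a durable cache.
    for url, data in url_embed_data.items():
        if data is not None:
            content += f"\n[preview of {url}: {data['title']}]"
    return content

def handle_embed_links_event(content: str, urls: List[str]) -> str:
    url_embed_data = {url: fetch_embed_data(url) for url in urls}
    return render_with_embeds(content, url_embed_data)
```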
Alex Vandiver
5ff82c82ae preview: Use a dataclass for the embed data.
This is significantly cleaner than passing around `Dict[str, Any]` all
of the time.

(cherry picked from commit 327ff9ea0f)
2022-05-03 16:10:25 -07:00
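
Roughly the kind of cleanup involved; the field names here are illustrative, not the actual dataclass definition:

```
from dataclasses import dataclass
from typing import Optional

@dataclass
class UrlEmbedData:
    # Illustrative fields only; the real dataclass in zerver may differ.
    type: Optional[str] = None
    title: Optional[str] = None
    description: Optional[str] = None
    image: Optional[str] = None

# Compared with Dict[str, Any], typos in field names and wrong types are
# caught by mypy, and call sites document themselves:
data = UrlEmbedData(title="Zulip", description="Open-source team chat")
```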
Alex Vandiver
00b3da0a0c populate_db: Remove unnecessary pre-population of URL cache.
76deb30312 changed this to not just be the URL, but rather a
prefixed hash of the URL, but failed to update this location which
wrote to it.  This meant that this pre-population step was writing to
the wrong keys in the durable cache, and thus ineffective.

Then, da33b72848 switched the cache to be in-memory, making this
write to the wrong keys in an in-process memory store.  There is no
way to pre-fill this sort of cache, except at server start-up.

Finally, and most fundamentally, 8c0c9ca7a4 then disabled
`inline_url_embed_preview` by default, making the code entirely moot.

Remove the triply-unnecessary code.

(cherry picked from commit ede4a88b49)
2022-05-03 16:10:25 -07:00
Alex Vandiver
9ded5be2a7 cache: Make the cache_name=None behaviour clearer.
`django.core.cache.cache` is equal to
`django.core.cache.caches["default"]`; the latter is more
understandable in context.

(cherry picked from commit aaa58a49db)
2022-05-03 16:10:25 -07:00
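
A quick illustration of the equivalence, assuming a configured Django project:

```
from django.core.cache import cache, caches

# Both names resolve to the backend configured as "default", so a value
# written through one is visible through the other.
caches["default"].set("some-key", 1)
assert cache.get("some-key") == 1
```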
Alex Vandiver
0d0aaf3c92 markdown: Use named parameters to add_a helper.
This has enough parameters that it benefits from making which is which
explicit.

(cherry picked from commit 661c333377)
2022-05-03 16:10:24 -07:00
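
For illustration, the general pattern; this is not the actual `add_a` signature:

```
# Hypothetical helper with several string parameters.
def add_a(root: list, url: str, link: str, *, title: str = "", desc: str = "") -> None:
    root.append({"url": url, "href": link, "title": title, "desc": desc})

root: list = []
# Keyword arguments make it obvious which string is which at the call site,
# instead of a run of easily-swapped positional strings:
add_a(root, url="https://example.com/thumb.png", link="https://example.com/", title="Example")
```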
Alex Vandiver
26907e1c2e markdown: Clarify url parameter of "add_a" helper.
(cherry picked from commit 452a30305d)
2022-05-03 16:10:24 -07:00
Aman Agrawal
953f3c8c1d resize: Don't use visible selector to find element states.
This change decreases the time required to open compose
after clicking a message. The amount of time saved varies with the PC.

The time reduction was around 0.4s to 0.6s for me with a
6x CPU slowdown. This may not sound convincing, but the profile
uploaded in #21979 clearly shows that the root cause of a message
click taking 10s was the `:visible` query.

Fixes #21979
2022-05-03 09:36:32 -07:00
Alex Vandiver
abf82392a3 puppet: Fix non-replicated PostgreSQL 10 and 11 configuration.
6f5ae8d13d removed the `$replication` variable from the
configurations of PostgreSQL 12 and higher, but left it in the
templates for PostgreSQL 10 and 11.  Because `undef != ''`,
deployments on PostgreSQL 10 and 11 started trying to push to S3
backups, regardless of whether they were configured, leaving frequent log
messages like:

```
2022-04-30 12:45:47.805 UTC [626d24ec.1f8db0]: [107-1] LOG: archiver process (PID 2086106) exited with exit code 1
2022-04-30 12:45:49.680 UTC [626d24ee.1f8dc3]: [18-1] LOG: checkpoint complete: wrote 19 buffers (0.0%); 0 WAL file(s) added, 0 removed, 0 recycled; write=1.910 s, sync=0.022 s, total=1.950 s; sync files=16, longest=0.018 s, average=0.002 s; distance=49 kB, estimate=373 kB
/usr/bin/timeout: failed to run command "/usr/local/bin/env-wal-g": No such file or directory
2022-04-30 12:46:17.852 UTC [626d2f99.1fd4e9]: [1-1] FATAL: archive command failed with exit code 127
2022-04-30 12:46:17.852 UTC [626d2f99.1fd4e9]: [2-1] DETAIL: The failed archive command was: /usr/bin/timeout 10m /usr/local/bin/env-wal-g wal-push pg_wal/000000010000000300000080
```

Switch the PostgreSQL 10 and 11 configuration to check
`s3_backups_bucket`, like the other versions.

(cherry picked from commit d891b9590a)
2022-05-02 17:58:27 -07:00
Alex Vandiver
fb9cdf0f56 compare-settings-to-template: Handle prod_settings_template renaming.
(cherry picked from commit 3476f63dca)
2022-04-28 15:46:22 -07:00
Alex Vandiver
df80303a64 compare-settings-to-template: Simplify and dedent logic.
(cherry picked from commit b6b6faa404)
2022-04-28 15:46:22 -07:00
Alex Vandiver
d7fb2292eb compare-settings-to-template: Fetch 100 per pagination.
(cherry picked from commit d205050ab0)
2022-04-28 15:46:22 -07:00
Alex Vandiver
827d1d9d3b compare-settings-to-template: Paginate through all tags.
The default page size is 30, which means this only goes back to 4.6 at
present, due to starting with `shared-...` and old `enterprise-...`
tags.

(cherry picked from commit d79776f80d)
2022-04-28 15:46:22 -07:00
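
A sketch of the pagination pattern, assuming the GitHub REST API's `per_page`/`page` parameters; the actual tool may be structured differently:

```
import requests

def all_tags(repo: str = "zulip/zulip"):
    # Fetch 100 tags per page and keep paginating until a page comes back
    # empty, so old shared-*/enterprise-* tags no longer hide the 4.x tags.
    page = 1
    while True:
        resp = requests.get(
            f"https://api.github.com/repos/{repo}/tags",
            params={"per_page": 100, "page": page},
            timeout=30,
        )
        resp.raise_for_status()
        batch = resp.json()
        if not batch:
            return
        yield from batch
        page += 1
```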
Alex Vandiver
64b563e1dc prod_settings_template: Switch to double quotes in commented lines. 2022-04-28 12:40:52 -07:00
Alex Vandiver
92fdfffa4d prod_settings_template: Add some missing quotes in commented lines. 2022-04-28 12:40:50 -07:00
Alex Vandiver
1767a0bcb1 puppet: Check that certbot certs are in use before fixing them.
It is possible to have previously installed certbot, but switched back
to using self-signed certificates -- in which case renewing them using
certbot may fail.

Verify that the certificate is a symlink into certbot's output
directory before running `fix-standalone-certbot`.

(cherry picked from commit c97162e485)
2022-04-28 00:53:56 -07:00
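
The real check lives in the Puppet manifests; a Python sketch of the idea, with an assumed certificate path:

```
import os

CERT_PATH = "/etc/ssl/certs/zulip.combined-chain.crt"  # assumed path, for illustration

def cert_is_managed_by_certbot(path: str = CERT_PATH) -> bool:
    # Only treat the certificate as certbot-managed (and thus safe to renew
    # with fix-standalone-certbot) if it is a symlink into certbot's output tree.
    return os.path.islink(path) and os.path.realpath(path).startswith("/etc/letsencrypt/")
```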
Anders Kaseorg
6736b35f5f puppet: ‘supervisorctl stop all’ before restarting Supervisor.
This fixes a failure of the 3.4 upgrade test running on Ubuntu 20.04
with Supervisor 4.

Signed-off-by: Anders Kaseorg <anders@zulip.com>
(cherry picked from commit a7e6cb7705)
2022-04-27 12:15:07 -07:00
Mateusz Mandera
c34110f88c digest: Don't send emails to deactivated users, even if queued. 2022-04-15 14:33:16 -07:00
Mateusz Mandera
fc4102d779 test_digest: Fix typo in a comment. 2022-04-15 14:33:15 -07:00
Anders Kaseorg
4d0ddf483d actions: Delete zerver.lib.actions.
Signed-off-by: Anders Kaseorg <anders@zulip.com>
(cherry picked from commit cc30ed8ec7)
2022-04-15 10:08:19 -07:00
Anders Kaseorg
9c927e40d6 actions: Move part into zerver.lib.test_classes.
Signed-off-by: Anders Kaseorg <anders@zulip.com>
(cherry picked from commit 729019acdd)
2022-04-15 10:08:19 -07:00
Anders Kaseorg
4d21bad033 actions: Split out zerver.actions.create_realm.
Signed-off-by: Anders Kaseorg <anders@zulip.com>
(cherry picked from commit e01faebd7e)
2022-04-15 10:08:19 -07:00
Anders Kaseorg
472428621a actions: Split out zerver.actions.realm_domains.
Signed-off-by: Anders Kaseorg <anders@zulip.com>
(cherry picked from commit 53f4a395bc)
2022-04-15 10:08:19 -07:00
Anders Kaseorg
37b40df30c actions: Split out zerver.actions.realm_settings.
Signed-off-by: Anders Kaseorg <anders@zulip.com>
(cherry picked from commit 59f6b090c7)
2022-04-15 10:08:19 -07:00
Anders Kaseorg
87c58f8e23 actions: Move part into zerver.forms.
Signed-off-by: Anders Kaseorg <anders@zulip.com>
(cherry picked from commit 12de8d797e)
2022-04-15 10:08:19 -07:00
Anders Kaseorg
eb5832f7a4 actions: Split out zerver.actions.message_edit.
Signed-off-by: Anders Kaseorg <anders@zulip.com>
(cherry picked from commit eda000899b)
2022-04-15 10:08:19 -07:00
Anders Kaseorg
a9b6d9990a actions: Split out zerver.actions.muted_users.
Signed-off-by: Anders Kaseorg <anders@zulip.com>
(cherry picked from commit 5d1a5a3877)
2022-04-15 10:08:19 -07:00
Anders Kaseorg
f4fe1660f3 actions: Split out zerver.actions.bots.
Signed-off-by: Anders Kaseorg <anders@zulip.com>
(cherry picked from commit ec174dfb47)
2022-04-15 10:08:19 -07:00
Anders Kaseorg
3bb3a415a8 actions: Split out zerver.actions.message_flags.
Signed-off-by: Anders Kaseorg <anders@zulip.com>
(cherry picked from commit eb4e9fe1e7)
2022-04-15 10:08:19 -07:00
Anders Kaseorg
cca19fedf0 actions: Split out zerver.actions.reactions.
Signed-off-by: Anders Kaseorg <anders@zulip.com>
(cherry picked from commit e5500a2226)
2022-04-15 10:08:19 -07:00
Anders Kaseorg
c59eb24674 actions: Split out zerver.actions.create_user.
Signed-off-by: Anders Kaseorg <anders@zulip.com>
(cherry picked from commit cbad5739ab)
2022-04-15 10:08:19 -07:00
Anders Kaseorg
c530f1b582 actions: Split out zerver.actions.streams.
Signed-off-by: Anders Kaseorg <anders@zulip.com>
(cherry picked from commit 5fcbc412cf)
2022-04-15 10:08:19 -07:00
Anders Kaseorg
3b48bcca95 actions: Split out zerver.actions.message_send.
Signed-off-by: Anders Kaseorg <anders@zulip.com>
(cherry picked from commit 975066e3f0)
2022-04-15 10:08:19 -07:00
Anders Kaseorg
50ca78447e actions: Split out zerver.actions.user_settings.
Signed-off-by: Anders Kaseorg <anders@zulip.com>
(cherry picked from commit ec6355389a)
2022-04-15 10:08:19 -07:00
Anders Kaseorg
b4d9cd4e0f actions: Split out zerver.actions.users.
Signed-off-by: Anders Kaseorg <anders@zulip.com>
(cherry picked from commit d7981dad62)
2022-04-15 10:08:19 -07:00
Anders Kaseorg
7c5e017c14 actions: Split out zerver.actions.custom_profile_fields.
Signed-off-by: Anders Kaseorg <anders@zulip.com>
(cherry picked from commit bbce879c81)
2022-04-15 10:08:19 -07:00
Anders Kaseorg
7ba639960d actions: Move part into zerver.lib.bulk_create.
Signed-off-by: Anders Kaseorg <anders@zulip.com>
(cherry picked from commit f6a06ba6e3)
2022-04-15 10:08:19 -07:00
Anders Kaseorg
76641a5f21 actions: Move part into zerver.lib.message.
Signed-off-by: Anders Kaseorg <anders@zulip.com>
(cherry picked from commit c041b68578)
2022-04-15 10:08:19 -07:00
Anders Kaseorg
b54240d6cf actions: Move part into zerver.lib.subscription_info.
Signed-off-by: Anders Kaseorg <anders@zulip.com>
(cherry picked from commit 9dd7e34ab3)
2022-04-15 10:08:19 -07:00
Anders Kaseorg
508c676f61 actions: Split out zerver.actions.presence.
Signed-off-by: Anders Kaseorg <anders@zulip.com>
(cherry picked from commit b7adfb02f6)
2022-04-15 10:08:19 -07:00
Anders Kaseorg
b60ba10351 actions: Move part into zerver.lib.users.
Signed-off-by: Anders Kaseorg <anders@zulip.com>
(cherry picked from commit ab04068294)
2022-04-15 10:08:19 -07:00
Anders Kaseorg
b8567d8d8f actions: Split out zerver.actions.uploads.
Signed-off-by: Anders Kaseorg <anders@zulip.com>
(cherry picked from commit e230ea2598)
2022-04-15 10:08:19 -07:00
Anders Kaseorg
025219da16 actions: Move part into zerver.lib.streams.
Signed-off-by: Anders Kaseorg <anders@zulip.com>
(cherry picked from commit a29f1b39da)
2022-04-15 10:08:19 -07:00
Anders Kaseorg
5bcb52390c actions: Split out zerver.actions.user_activity.
Signed-off-by: Anders Kaseorg <anders@zulip.com>
(cherry picked from commit 6168c0110a)
2022-04-15 10:08:19 -07:00
Anders Kaseorg
90cbf900d4 actions: Split out zerver.actions.user_topics.
Signed-off-by: Anders Kaseorg <anders@zulip.com>
(cherry picked from commit df4849bb15)
2022-04-15 10:08:19 -07:00
Anders Kaseorg
ddf76baf89 actions: Split out zerver.actions.realm_emoji.
Signed-off-by: Anders Kaseorg <anders@zulip.com>
(cherry picked from commit 385616f27f)
2022-04-15 10:08:19 -07:00
Anders Kaseorg
be7169bed0 actions: Split out zerver.actions.realm_export.
Signed-off-by: Anders Kaseorg <anders@zulip.com>
(cherry picked from commit 8fc5922ebd)
2022-04-15 10:08:19 -07:00
Anders Kaseorg
bdc67055b1 actions: Split out zerver.actions.realm_icon.
Signed-off-by: Anders Kaseorg <anders@zulip.com>
(cherry picked from commit 3d7aa98c45)
2022-04-15 10:08:19 -07:00
Anders Kaseorg
f5b96c8551 actions: Split out zerver.actions.realm_logo.
Signed-off-by: Anders Kaseorg <anders@zulip.com>
(cherry picked from commit 7f088f3403)
2022-04-15 10:08:19 -07:00
Anders Kaseorg
2e48056a9c actions: Split out zerver.actions.invites.
Signed-off-by: Anders Kaseorg <anders@zulip.com>
(cherry picked from commit ca8d374e21)
2022-04-15 10:08:19 -07:00
Anders Kaseorg
e0f9f58411 actions: Split out zerver.actions.alert_words.
Signed-off-by: Anders Kaseorg <anders@zulip.com>
(cherry picked from commit 241463e215)
2022-04-15 10:08:19 -07:00
Anders Kaseorg
f29b1d3192 actions: Split out zerver.actions.default_streams.
Signed-off-by: Anders Kaseorg <anders@zulip.com>
(cherry picked from commit 1ac7496855)
2022-04-15 10:08:19 -07:00
Anders Kaseorg
b1e8ead908 actions: Split out zerver.actions.hotspots.
Signed-off-by: Anders Kaseorg <anders@zulip.com>
(cherry picked from commit 12130da339)
2022-04-15 10:08:19 -07:00
Anders Kaseorg
5cb7acec36 actions: Split out zerver.actions.realm_linkifiers.
Signed-off-by: Anders Kaseorg <anders@zulip.com>
(cherry picked from commit 975f5a3c2d)
2022-04-15 10:08:19 -07:00
Anders Kaseorg
0cb261ac6b actions: Split out zerver.actions.realm_playgrounds.
Signed-off-by: Anders Kaseorg <anders@zulip.com>
(cherry picked from commit e887abcf41)
2022-04-15 10:08:19 -07:00
Anders Kaseorg
b58a5b3bf3 actions: Split out zerver.actions.submessage.
Signed-off-by: Anders Kaseorg <anders@zulip.com>
(cherry picked from commit 3a135b04d9)
2022-04-15 10:08:19 -07:00
Anders Kaseorg
76dc8bc9f7 actions: Split out zerver.actions.typing.
Signed-off-by: Anders Kaseorg <anders@zulip.com>
(cherry picked from commit 62d3b5bfd5)
2022-04-15 10:08:19 -07:00
Anders Kaseorg
bf5f006971 actions: Split out zerver.actions.user_groups.
Signed-off-by: Anders Kaseorg <anders@zulip.com>
(cherry picked from commit 372c10f5f3)
2022-04-15 10:08:19 -07:00
Anders Kaseorg
500bd04e11 actions: Split out zerver.actions.video_calls.
Signed-off-by: Anders Kaseorg <anders@zulip.com>
(cherry picked from commit 90cae59ea6)
2022-04-15 10:08:19 -07:00
Anders Kaseorg
d35bdd312f actions: Split out zerver.lib.recipient_users.
Signed-off-by: Anders Kaseorg <anders@zulip.com>
(cherry picked from commit c136eebb33)
2022-04-15 10:08:19 -07:00
Anders Kaseorg
51d9bbca1e actions: Split out zerver.lib.user_counts.
Signed-off-by: Anders Kaseorg <anders@zulip.com>
(cherry picked from commit 703186c339)
2022-04-15 10:08:19 -07:00
Anders Kaseorg
f3c9a5019b actions: Split out zerver.lib.user_message.
Signed-off-by: Anders Kaseorg <anders@zulip.com>
(cherry picked from commit 05195c02c1)
2022-04-15 10:08:19 -07:00
Anders Kaseorg
36fa5e0385 actions: Move part into zerver.models.
Signed-off-by: Anders Kaseorg <anders@zulip.com>
(cherry picked from commit 7f00aa078e)
2022-04-15 10:08:19 -07:00
Anders Kaseorg
faea77d03f actions: Split out zerver.lib.sounds.
Signed-off-by: Anders Kaseorg <anders@zulip.com>
(cherry picked from commit 6a70f75587)
2022-04-15 10:08:19 -07:00
Anders Kaseorg
ea9ba8b24c actions: Add zerver/actions directory.
Signed-off-by: Anders Kaseorg <anders@zulip.com>
(cherry picked from commit dd8b1aaba6)
2022-04-15 10:08:19 -07:00
Greg Price
051f1c3120 docs: Update apps' compatibility threshold to 3.0, from 2.1.0.
Zulip Server 3.0 is now about 21 months old, which is more than
18 months.  Per the general policy in the "Client apps" section
below, that means it's time to drop support for older versions.

We released 4.0 in 2021-05, so around 2022-11 we can update this
further to say 4.0.
2022-04-14 11:54:44 -07:00
Alex Vandiver
fb03c3205e timeout: Add test coverage.
(cherry picked from commit e6e4b7b3ef)
2022-04-13 20:47:34 -07:00
Alex Vandiver
662396d2c5 timeout: Minor comment cleanups.
We remove the StackOverflow link because it is now so dated as to be
irrelevant -- it does not use `self.ident`, and cargo-cults the return
value of PyThreadState_SetAsyncExc.

(cherry picked from commit 04159a674c)
2022-04-13 20:47:34 -07:00
Alex Vandiver
4a5204a967 timeout: Warn if the thread did not exit.
As noted in the docstring for this function, the timeout is
best-effort only -- if the thread is blocked in a syscall, it will not
service the exception until it returns.  It can also choose to catch
and ignore the TimeoutExpired; in either case it will still be running
even after the `timeout()` function returns.

Raising a bare TimeoutExpired is still somewhat accurate, but obscures
that the backend thread may still be running along merrily.  Notice
such cases, and log a warning about them.

(cherry picked from commit 3af2c8d9a3)
2022-04-13 20:47:34 -07:00
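
A rough sketch of the best-effort timeout pattern described here, using the standard `ctypes` async-exception mechanism; this is not `zerver.lib.timeout` verbatim:

```
import ctypes
import logging
import threading

class TimeoutExpired(Exception):
    pass

def run_with_timeout(func, timeout: float):
    result = {}

    def target():
        result["value"] = func()

    thread = threading.Thread(target=target, daemon=True)
    thread.start()
    thread.join(timeout)
    if thread.is_alive():
        # Best effort: throw TimeoutExpired into the worker thread.  It only
        # takes effect once the thread returns from any blocking syscall,
        # and the thread may catch and ignore it.
        ctypes.pythonapi.PyThreadState_SetAsyncExc(
            ctypes.c_ulong(thread.ident), ctypes.py_object(TimeoutExpired)
        )
        thread.join(1)
        if thread.is_alive():
            # The thread never serviced (or swallowed) the exception; it may
            # still be running merrily in the background, so say so.
            logging.warning("Timed-out thread is still running")
        raise TimeoutExpired
    return result.get("value")
```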
Alex Vandiver
44a3cd8dd3 timeout: Re-raise from where the TimeoutExpired hit the thread.
Having just thrown an exception into the thread, it is often useful to
know _what_ was the slow code that we interrupted.  Raising a bare
TimeoutExpired here obscures that information, as any `exc_info` will
end there.

Examine the thread for any exception information, and use that to
re-raise.  This exception information is not guaranteed to exist -- if
the thread didn't respond to the exception in time, or caught it, for
instance.

(cherry picked from commit e714264756)
2022-04-13 20:47:34 -07:00
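
A sketch of recording the exception information where it actually hit the worker thread, so the caller can re-raise with that traceback; the names are illustrative:

```
import sys
import threading

class Worker(threading.Thread):
    exc_info = None

    def __init__(self, func):
        super().__init__(daemon=True)
        self.func = func

    def run(self):
        try:
            self.func()
        except BaseException:
            # Record exactly where the (possibly injected) exception landed.
            self.exc_info = sys.exc_info()

def reraise_from(worker: Worker) -> None:
    # Not guaranteed to exist: the thread may have caught the exception,
    # or may not have serviced it yet.
    if worker.exc_info is not None:
        raise worker.exc_info[1].with_traceback(worker.exc_info[2])
```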
Alex Vandiver
efddda2609 timeout: Remove cargo-culted and impossible-to-reach code block.
The quote in question originates in python/cpython@b8b6d0c2c6, when
the code was added.  However, the code stopped having that comment,
and was no longer able to return anything but 1 or 0, starting in
python/cpython@4643c2fda1 -- Python 2.5.

Remove the block.

(cherry picked from commit 85eeaf5f18)
2022-04-13 20:47:34 -07:00
Alex Vandiver
ecfcc20351 send_email: Only warn if EMAIL_HOST_PASSWORD is unset, not "".
Some email hosts actually do want an empty password; since the default
is `None`, we should key on that, and not just being false-y.

(cherry picked from commit ee04f42897)
2022-04-13 20:46:38 -07:00
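
A sketch of the distinction, assuming (as the message describes) that the unset default is `None`:

```
import logging

from django.conf import settings

def maybe_warn_about_smtp_password() -> None:
    # Only None means "unset"; an explicitly configured empty string is a
    # valid choice for email hosts that want no password.
    if settings.EMAIL_HOST_PASSWORD is None:
        logging.warning("EMAIL_HOST_PASSWORD is unset; outgoing email may fail to authenticate.")
```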
Alex Vandiver
fa68acd669 settings: Use default database_user value when looking up.
Failure to pull the default "zulip" value here can lead to
accidentally applying a `postgres_password` value which is unnecessary
and may never work.

For consistency, always skip password auth attempts for the "zulip"
user on localhost, even if the password is set.  This mirrors the
behavior of `process_fts_updates`.

(cherry picked from commit 828c9d1c18)
2022-04-13 20:44:56 -07:00
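
A sketch of the intent, with hypothetical `config`/`secrets` dictionaries rather than the actual scripts/lib code:

```
def postgres_connect_kwargs(config: dict, secrets: dict, host: str = "") -> dict:
    # Fall back to the default "zulip" user if database_user is not set.
    user = config.get("database_user", "zulip")
    kwargs = {"user": user, "host": host}
    if user == "zulip" and host in ("", "localhost"):
        # Mirror process_fts_updates: skip password auth for the "zulip"
        # user on localhost, even if postgres_password is set.
        return kwargs
    kwargs["password"] = secrets.get("postgres_password")
    return kwargs
```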
Mateusz Mandera
9c88f6c4ce scim: Temporarily stop running SCIM change operations atomically.
do_deactivate_user can't be run in an atomic block due to concerns
around revoking sessions in a transaction. See
62ba8e455d for more details.

Without the change in this commit, the process of deactivating a user
via SCIM is broken.
2022-04-13 16:02:04 -07:00
Mateusz Mandera
8aa6958923 docs: Fix incorrect path to SAML certs in SAML Keycloak instructions.
This was supposed to be /etc/zulip/saml/idps/
2022-04-13 16:01:25 -07:00
Alex Vandiver
88e2f64869 docs: Fix typo.
We don't suggest self-hosing, unless via a sprinkler in warm weather.
2022-04-13 11:38:35 -07:00
Alex Vandiver
c7df68eb48 check-database-compatibility: Sort and prettify output.
(cherry picked from commit 09860dc284)
2022-04-06 14:16:55 -07:00
Alex Vandiver
599094bcf5 docs: Fold FTS index updating into the upgrade step.
On the Debian 10 -> 11 upgrade, the server is running Zulip 4.x, which
lets us pass `--audit-fts-indexes` to `upgrade-zulip-stage-2` rather
than run the command as a separate step.

(cherry picked from commit 488aaef9b7)
2022-04-06 14:16:55 -07:00
Alex Vandiver
4a4be8620c docs: Upgrade Zulip before trying to fix collations.
The reindex-textual-data tool needs the venv to be able to run;
switch the order of the last two steps, making them match the
Debian 9 -> 10 and 10 -> 11 upgrades.

Ref #21296.

(cherry picked from commit 1e3a6984a4)
2022-04-06 14:16:54 -07:00
Aman Agrawal
3dc29fbc76 message_edit: Fix false sub/unsub bookend on using a near link.
We were not setting the `historical` flag correctly for
messages fetched via `json_fetch_raw_message` when the user didn't
have any UserMessage.

Extended the relevant tests to also check message flags.
2022-04-04 12:34:43 -07:00
Alex Vandiver
c49dfc5679 version: Update version after 5.1 release. 2022-04-01 23:13:55 -07:00
Alex Vandiver
08c2d9a766 Release Zulip Server 5.1. 2022-04-02 05:45:55 +00:00
Alex Vandiver
d9e7feae0a migrations: Remove the possibly-duplicated emoji re-uploading.
In 85e531e377, we duplicated this block
of migration code to fix a bug, but moving it (aka deleting the
original copy) is a cleaner solution.

(cherry picked from commit 35e27aef4a)
2022-04-01 22:31:54 -07:00
Alex Vandiver
58e29a9ca0 supervisor: 'foo:*' also matches 'foo'.
7c4293a7d3 switched to checking if the
service was already running, and use `supervisorctl start` if it was
not.

Unfortunately, `list_supervisor_processes("zulip-tornado:*")` did not
include `zulip-tornado`, and as such a non-sharded process was always
considered to _not_ be running, and was thus started, not restarted.
Starting an already-started service is a no-op, and thus non-sharded
tornado processes were never restarted.

The observed behaviour is that requests to the tornado process attempt
to load the user from the cache, with a different prefix from Django,
and immediately invalidate the session and eject the user back to the
login page.

Fix the `list_supervisor_processes` logic to match without the
trailing `:*`.

(cherry picked from commit 65e19c4fbd)
2022-04-01 17:25:29 -07:00
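
A sketch of the corrected matching logic:

```
def group_matches(pattern: str, process_name: str) -> bool:
    # "zulip-tornado:*" must match both sharded processes like
    # "zulip-tornado:port-9800" and a plain, non-sharded "zulip-tornado".
    if pattern.endswith(":*"):
        group = pattern[: -len(":*")]
        return process_name == group or process_name.startswith(group + ":")
    return process_name == pattern

assert group_matches("zulip-tornado:*", "zulip-tornado")
assert group_matches("zulip-tornado:*", "zulip-tornado:port-9800")
```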
Alex Vandiver
266dbad737 check-database-compatibility: Ignore squashed and renamed migrations.
Fixes: #21596.
(cherry picked from commit eb31681934)
2022-04-01 17:03:38 -07:00
Tim Abbott
4db1aa75ce migrations: Repeat part of migration 0376.
The blockquote explains the motivation for this change in detail.

Fixes #21608.

(cherry picked from commit 85e531e377)
2022-04-01 15:21:05 -07:00
Alex Vandiver
fc0d5fcfd5 upgrade: Mark puppet as having started the server.
We previously used restart-server if puppet was run, as a nod to the
fact that `supervisor reread && supervisor update` will _start_
service groups that were modified, even if they were previously
stopped; this is because they are marked as `autostart=true`, which is
honored on service change.

However, upgrades want to run while there are no services running.  If
puppet is run, explicitly set the server as potentially being "up", so
that a `shutdown_server()` before migrations, if they exist, will stop
services.

(cherry picked from commit 0af00a3233)
2022-03-31 17:36:57 -07:00
Alex Vandiver
00382078ad upgrade: Move the shutdown_server calls to where they are relevant.
shutdown_server is a noop if the server is already stopped; placing
these in each block makes the logic more apparent.

(cherry picked from commit e9596637e7)
2022-03-31 17:36:55 -07:00
Tim Abbott
9313e8f909 i18n: Update translation data from Transifex. 2022-03-31 13:18:55 -07:00
Alex Vandiver
9bb31433f1 data_import: Fix bot email address de-duplication.
4815f6e28b tried to de-duplicate bot
email addresses, but instead caused duplicates to crash:

```
Traceback (most recent call last):
  File "./manage.py", line 157, in <module>
    execute_from_command_line(sys.argv)
  File "./manage.py", line 122, in execute_from_command_line
    utility.execute()
  File "/srv/zulip-venv-cache/56ac6adf406011a100282dd526d03537be84d23e/zulip-py3-venv/lib/python3.8/site-packages/django/core/management/__init__.py", line 413, in execute
    self.fetch_command(subcommand).run_from_argv(self.argv)
  File "/srv/zulip-venv-cache/56ac6adf406011a100282dd526d03537be84d23e/zulip-py3-venv/lib/python3.8/site-packages/django/core/management/base.py", line 354, in run_from_argv
    self.execute(*args, **cmd_options)
  File "/srv/zulip-venv-cache/56ac6adf406011a100282dd526d03537be84d23e/zulip-py3-venv/lib/python3.8/site-packages/django/core/management/base.py", line 398, in execute
    output = self.handle(*args, **options)
  File "/home/zulip/deployments/2022-03-16-22-25-42/zerver/management/commands/convert_slack_data.py", line 59, in handle
    do_convert_data(path, output_dir, token, threads=num_threads)
  File "/home/zulip/deployments/2022-03-16-22-25-42/zerver/data_import/slack.py", line 1320, in do_convert_data
    ) = slack_workspace_to_realm(
  File "/home/zulip/deployments/2022-03-16-22-25-42/zerver/data_import/slack.py", line 141, in slack_workspace_to_realm
    ) = users_to_zerver_userprofile(slack_data_dir, user_list, realm_id, int(NOW), domain_name)
  File "/home/zulip/deployments/2022-03-16-22-25-42/zerver/data_import/slack.py", line 248, in users_to_zerver_userprofile
    email = get_user_email(user, domain_name)
  File "/home/zulip/deployments/2022-03-16-22-25-42/zerver/data_import/slack.py", line 406, in get_user_email
    return SlackBotEmail.get_email(user["profile"], domain_name)
  File "/home/zulip/deployments/2022-03-16-22-25-42/zerver/data_import/slack.py", line 85, in get_email
    email_prefix += cls.duplicate_email_count[email]
TypeError: can only concatenate str (not "int") to str
```

Fix the stringification, make it case-insensitive, append with a dash
for readability, and add tests for all of the above.
2022-03-31 11:29:53 -07:00
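
A simplified sketch of the fix (the real importer derives the prefix from the Slack profile):

```
from collections import defaultdict
from typing import Dict

class SlackBotEmail:
    duplicate_email_count: Dict[str, int] = defaultdict(int)

    @classmethod
    def get_email(cls, email_prefix: str, domain_name: str) -> str:
        # Key the duplicate counter on the lowercased email, so that
        # de-duplication is case-insensitive.
        email = f"{email_prefix}-bot@{domain_name}".lower()
        count = cls.duplicate_email_count[email]
        cls.duplicate_email_count[email] += 1
        if count > 0:
            # Stringify the counter and separate it with a dash for
            # readability, rather than attempting `str + int`, which raised
            # the TypeError in the traceback above.
            email = f"{email_prefix}-{count}-bot@{domain_name}".lower()
        return email
```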
Aman Agrawal
4c313ff652 navbar_alerts: Adjust height of recent topics when alert is visible.
Fixes #21619

We need to adjust the height of recent topics along with the app;
otherwise the container becomes separately scrollable due to
it overflowing the app height.
2022-03-31 11:27:03 -07:00
Heidi Ahlberg
9ba6664c44 i18n: Fix missing translation tags in stream creation view. 2022-03-31 10:37:36 -07:00
Anders Kaseorg
8616c2e092 changelog: Remove broken link.
Signed-off-by: Anders Kaseorg <anders@zulip.com>
(cherry picked from commit 7de1e7c477)
2022-03-30 20:38:16 -07:00
Anders Kaseorg
6ca04586c1 puppet: Do not ensure Chrony is running.
Commit f6d27562fa (#21564) tried to
ensure Chrony is running, which fails in containers where Chrony
doesn’t have permission to update the host clock.

The Debian package should still attempt to start it, and Puppet should
still restart it when chrony.conf is modified.

Signed-off-by: Anders Kaseorg <anders@zulip.com>
2022-03-30 11:38:05 -07:00
Sahil Batra
04fc7e293e settings: Fix push notifications tooltip being incorrectly shown.
We were showing the push notifications tooltip in the user default
settings section even if push notifications were configured
on the server.

The bug was that the setting value was undefined in the template
used for the user default settings section, so this commit fixes it
by correctly passing the setting value to the relevant template file.

Fixes #21602.
2022-03-30 11:31:49 -07:00
Tim Abbott
7809ecd38e version: Update version after 5.0 release. 2022-03-29 08:31:25 -07:00
4250 changed files with 298797 additions and 410748 deletions


@@ -3,6 +3,3 @@
last 2 versions
Firefox ESR
not dead and supports async-functions
[test]
current Node


@@ -16,12 +16,3 @@ fpr
alls
nd
ot
womens
vise
falsy
ro
derails
forin
uper
slac
couldn


@@ -4,12 +4,11 @@
/docs/_build
/static/generated
/static/third
/static/webpack-bundles
/var/*
!/var/puppeteer
/var/puppeteer/*
!/var/puppeteer/test_credentials.d.ts
/web/generated
/web/third
/zulip-current-venv
/zulip-py3-venv


@@ -1,5 +1,4 @@
{
"root": true,
"env": {
"es2020": true,
"node": true
@@ -15,15 +14,12 @@
],
"parser": "@babel/eslint-parser",
"parserOptions": {
"requireConfigFile": false,
"warnOnUnsupportedTypeScriptVersion": false,
"sourceType": "unambiguous"
},
"plugins": ["formatjs", "no-jquery"],
"settings": {
"formatjs": {
"additionalFunctionNames": ["$t", "$t_html"]
},
"additionalFunctionNames": ["$t", "$t_html"],
"no-jquery": {
"collectionReturningPlugins": {
"expectOne": "always"
@@ -51,9 +47,14 @@
"import/first": "error",
"import/newline-after-import": "error",
"import/no-self-import": "error",
"import/no-unresolved": "off",
"import/no-useless-path-segments": "error",
"import/order": ["error", {"alphabetize": {"order": "asc"}, "newlines-between": "always"}],
"import/order": [
"error",
{
"alphabetize": {"order": "asc"},
"newlines-between": "always"
}
],
"import/unambiguous": "error",
"lines-around-directive": "error",
"new-cap": "error",
@@ -72,7 +73,6 @@
"no-implied-eval": "error",
"no-inner-declarations": "off",
"no-iterator": "error",
"no-jquery/no-constructor-attributes": "error",
"no-jquery/no-parse-html-literal": "error",
"no-label-var": "error",
"no-labels": "error",
@@ -93,15 +93,19 @@
"no-undef-init": "error",
"no-unneeded-ternary": ["error", {"defaultAssignment": false}],
"no-unused-expressions": "error",
"no-unused-vars": ["error", {"ignoreRestSiblings": true}],
"no-use-before-define": ["error", {"functions": false}],
"no-useless-concat": "error",
"no-useless-constructor": "error",
"no-var": "error",
"object-shorthand": ["error", "always", {"avoidExplicitReturnArrows": true}],
"object-shorthand": "error",
"one-var": ["error", "never"],
"prefer-arrow-callback": "error",
"prefer-const": ["error", {"ignoreReadBeforeAssign": true}],
"prefer-const": [
"error",
{
"ignoreReadBeforeAssign": true
}
],
"radix": "error",
"sort-imports": ["error", {"ignoreDeclarationSort": true}],
"spaced-comment": ["error", "always", {"markers": ["/"]}],
@@ -110,35 +114,36 @@
"unicorn/explicit-length-check": "off",
"unicorn/filename-case": "off",
"unicorn/no-await-expression-member": "off",
"unicorn/no-negated-condition": "off",
"unicorn/no-nested-ternary": "off",
"unicorn/no-null": "off",
"unicorn/no-process-exit": "off",
"unicorn/no-useless-undefined": "off",
"unicorn/number-literal-case": "off",
"unicorn/numeric-separators-style": "off",
"unicorn/prefer-module": "off",
"unicorn/prefer-node-protocol": "off",
"unicorn/prefer-spread": "off",
"unicorn/prefer-ternary": "off",
"unicorn/prefer-top-level-await": "off",
"unicorn/prevent-abbreviations": "off",
"unicorn/switch-case-braces": "off",
"valid-typeof": ["error", {"requireStringLiterals": true}],
"yoda": "error"
},
"overrides": [
{
"files": ["web/tests/**"],
"files": ["frontend_tests/node_tests/**", "frontend_tests/zjsunit/**"],
"rules": {
"no-jquery/no-selector-prop": "off"
}
},
{
"files": ["web/e2e-tests/**"],
"files": ["frontend_tests/puppeteer_lib/**", "frontend_tests/puppeteer_tests/**"],
"globals": {
"$": false,
"zulip_test": false
}
},
{
"files": ["web/src/**"],
"files": ["static/js/**"],
"globals": {
"StripeCheckout": false
}
@@ -146,9 +151,7 @@
{
"files": ["**/*.ts"],
"extends": [
"plugin:@typescript-eslint/recommended",
"plugin:@typescript-eslint/recommended-requiring-type-checking",
"plugin:@typescript-eslint/strict",
"plugin:import/typescript"
],
"parserOptions": {
@@ -167,29 +170,36 @@
"rules": {
// Disable base rule to avoid conflict
"no-duplicate-imports": "off",
"no-unused-vars": "off",
"no-useless-constructor": "off",
"no-use-before-define": "off",
"@typescript-eslint/consistent-type-definitions": ["error", "type"],
"@typescript-eslint/array-type": "error",
"@typescript-eslint/consistent-type-assertions": "error",
"@typescript-eslint/consistent-type-imports": "error",
"@typescript-eslint/explicit-function-return-type": [
"error",
{"allowExpressions": true}
],
"@typescript-eslint/member-ordering": "error",
"@typescript-eslint/no-duplicate-imports": "error",
"@typescript-eslint/no-duplicate-imports": "off",
"@typescript-eslint/no-explicit-any": "off",
"@typescript-eslint/no-extraneous-class": "error",
"@typescript-eslint/no-non-null-assertion": "off",
"@typescript-eslint/no-parameter-properties": "error",
"@typescript-eslint/no-unnecessary-condition": "off",
"@typescript-eslint/no-unnecessary-qualifier": "error",
"@typescript-eslint/no-unused-vars": ["error", {"varsIgnorePattern": "^_"}],
"@typescript-eslint/no-unsafe-argument": "off",
"@typescript-eslint/no-unsafe-assignment": "off",
"@typescript-eslint/no-unsafe-call": "off",
"@typescript-eslint/no-unsafe-member-access": "off",
"@typescript-eslint/no-unsafe-return": "off",
"@typescript-eslint/no-unused-vars": ["error", {"ignoreRestSiblings": true}],
"@typescript-eslint/no-use-before-define": ["error", {"functions": false}],
"@typescript-eslint/no-use-before-define": "error",
"@typescript-eslint/no-useless-constructor": "error",
"@typescript-eslint/prefer-includes": "error",
"@typescript-eslint/prefer-string-starts-ends-with": "error",
"@typescript-eslint/promise-function-async": "error",
"import/no-cycle": "error",
"@typescript-eslint/unified-signatures": "error",
"no-undef": "error"
}
},
@@ -200,7 +210,7 @@
}
},
{
"files": ["web/e2e-tests/**", "web/tests/**"],
"files": ["frontend_tests/**"],
"globals": {
"CSS": false,
"document": false,
@@ -215,7 +225,7 @@
}
},
{
"files": ["web/debug-require.js"],
"files": ["tools/debug-require.js"],
"env": {
"browser": true,
"es2020": false
@@ -229,27 +239,20 @@
}
},
{
"files": ["web/shared/**", "web/src/**", "web/third/**"],
"files": ["static/**"],
"env": {
"browser": true,
"node": false
},
"globals": {
"ZULIP_VERSION": false
},
"rules": {
"no-console": "error"
},
"settings": {
"import/resolver": {
"webpack": {
"config": "./web/webpack.config.ts"
}
}
"import/resolver": "webpack"
}
},
{
"files": ["web/shared/**"],
"files": ["static/shared/**"],
"env": {
"browser": false,
"shared-node-browser": true
@@ -260,14 +263,13 @@
{
"zones": [
{
"target": "./web/shared",
"target": "./static/shared",
"from": ".",
"except": ["./node_modules", "./web/shared"]
"except": ["./node_modules", "./static/shared"]
}
]
}
],
"unicorn/prefer-string-replace-all": "off"
]
}
}
]


@@ -1,10 +0,0 @@
---
name: Issue discussed in the Zulip development community
about: Bug report, feature or improvement already discussed on chat.zulip.org.
---
<!-- Issue description -->
<!-- Link to a message in the chat.zulip.org discussion. Message links will still work even if the topic is renamed or resolved. Link back to this issue from the chat.zulip.org thread. -->
CZO thread


@@ -1,17 +0,0 @@
---
name: Bug report
about: A concrete bug report with steps to reproduce the behavior. (See also "Possible bug" below.)
labels: ["bug"]
---
<!-- Describe what you were expecting to see, what you saw instead, and steps to take in order to reproduce the buggy behavior. Screenshots can be helpful. -->
<!-- Check the box for the version of Zulip you are using (see https://zulip.com/help/view-zulip-version).-->
**Zulip Server and web app version:**
- [ ] Zulip Cloud (`*.zulipchat.com`)
- [ ] Zulip Server 7.0+
- [ ] Zulip Server 6.0+
- [ ] Zulip Server 5.0 or older
- [ ] Other or not sure


@@ -1,6 +0,0 @@
---
name: Feature or improvement request
about: A specific proposal for a new feature of improvement. (See also "Feature suggestion or feedback" below.)
---
<!-- Describe the proposal, including how it would help you or your organization. -->


@@ -1,14 +0,0 @@
blank_issues_enabled: true
contact_links:
- name: Possible bug
url: https://zulip.readthedocs.io/en/latest/contributing/reporting-bugs.html
about: Report unexpected behavior that may be a bug.
- name: Feature suggestion or feedback
url: https://zulip.readthedocs.io/en/latest/contributing/suggesting-features.html
about: Start a discussion about your idea for improving Zulip.
- name: Issue with running or upgrading a Zulip server
url: https://zulip.readthedocs.io/en/latest/production/troubleshooting.html
about: We provide free, interactive support for the vast majority of questions about running a Zulip server.
- name: Other support requests and sales questions
url: https://zulip.com/help/contact-support
about: Contact us — we're happy to help!


@@ -1,43 +1,11 @@
<!-- Describe your pull request here.-->
<!-- What's this PR for? (Just a link to an issue is fine.) -->
Fixes: <!-- Issue link, or clear description.-->
**Testing plan:** <!-- How have you tested? -->
<!-- If the PR makes UI changes, always include one or more still screenshots to demonstrate your changes. If it seems helpful, add a screen capture of the new functionality as well.
**GIFs or screenshots:** <!-- If a UI change. See:
https://zulip.readthedocs.io/en/latest/tutorials/screenshot-and-gif-software.html
-->
Tooling tips: https://zulip.readthedocs.io/en/latest/tutorials/screenshot-and-gif-software.html
-->
**Screenshots and screen captures:**
<details>
<summary>Self-review checklist</summary>
<!-- Prior to submitting a PR, follow our step-by-step guide to review your own code:
https://zulip.readthedocs.io/en/latest/contributing/code-reviewing.html#how-to-review-code -->
<!-- Once you create the PR, check off all the steps below that you have completed.
If any of these steps are not relevant or you have not completed, leave them unchecked.-->
- [ ] [Self-reviewed](https://zulip.readthedocs.io/en/latest/contributing/code-reviewing.html#how-to-review-code) the changes for clarity and maintainability
(variable names, code reuse, readability, etc.).
Communicate decisions, questions, and potential concerns.
- [ ] Explains differences from previous plans (e.g., issue description).
- [ ] Highlights technical choices and bugs encountered.
- [ ] Calls out remaining decisions and concerns.
- [ ] Automated tests verify logic where appropriate.
Individual commits are ready for review (see [commit discipline](https://zulip.readthedocs.io/en/latest/contributing/commit-discipline.html)).
- [ ] Each commit is a coherent idea.
- [ ] Commit message(s) explain reasoning and motivation for changes.
Completed manual review and testing of the following:
- [ ] Visual appearance of the changes.
- [ ] Responsiveness and internationalization.
- [ ] Strings and tooltips.
- [ ] End-to-end functionality of buttons, interactions and flows.
- [ ] Corner cases, error conditions, and easily imagined bugs.
</details>
<!-- Also be sure to make clear, coherent commits:
https://zulip.readthedocs.io/en/latest/contributing/version-control.html
-->


@@ -0,0 +1,43 @@
name: Cancel previous runs
on: [push, pull_request]
defaults:
run:
shell: bash
jobs:
cancel:
name: Cancel previous runs
runs-on: ubuntu-latest
timeout-minutes: 3
# Don't run this job for zulip/zulip pushes since we
# want to run those jobs.
if: ${{ github.event_name != 'push' || github.event.repository.full_name != 'zulip/zulip' }}
steps:
# We get workflow IDs from GitHub API so we don't have to maintain
# a hard-coded list of IDs which need to be updated when a workflow
# is added or removed. And, workflow IDs are different for other forks
# so this is required.
- name: Get workflow IDs.
id: workflow_ids
continue-on-error: true # Don't fail this job on failure
env:
# This is in <owner>/<repo> format e.g. zulip/zulip
REPOSITORY: ${{ github.repository }}
run: |
workflow_api_url=https://api.github.com/repos/$REPOSITORY/actions/workflows
curl -fL $workflow_api_url -o workflows.json
script="const {workflows} = require('./workflows'); \
const ids = workflows.map(workflow => workflow.id); \
console.log(ids.join(','));"
ids=$(node -e "$script")
echo "::set-output name=ids::$ids"
- uses: styfle/cancel-workflow-action@0.9.0
continue-on-error: true # Don't fail this job on failure
with:
workflow_id: ${{ steps.workflow_ids.outputs.ids }}
access_token: ${{ github.token }}


@@ -2,39 +2,26 @@ name: "Code scanning"
on:
push:
branches: ["*.x", chat.zulip.org, main]
tags: ["*"]
pull_request:
branches: ["*.x", chat.zulip.org, main]
workflow_dispatch:
concurrency:
group: "${{ github.workflow }}-${{ github.head_ref || github.run_id }}"
cancel-in-progress: true
permissions:
contents: read
branches-ignore:
- dependabot/** # https://github.com/github/codeql-action/pull/435
pull_request: {}
jobs:
CodeQL:
permissions:
actions: read # for github/codeql-action/init to get workflow details
contents: read # for actions/checkout to fetch code
security-events: write # for github/codeql-action/analyze to upload SARIF results
if: ${{!github.event.repository.private}}
runs-on: ubuntu-latest
steps:
- name: Check out repository
uses: actions/checkout@v3
uses: actions/checkout@v2
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@v2
uses: github/codeql-action/init@v1
# Override language selection by uncommenting this and choosing your languages
# with:
# languages: go, javascript, csharp, python, cpp, java
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@v2
uses: github/codeql-action/analyze@v1


@@ -1,52 +1,42 @@
name: Zulip production suite
on:
push:
branches: ["*.x", chat.zulip.org, main]
tags: ["*"]
push: {}
pull_request:
paths:
- .github/workflows/production-suite.yml
- "**/migrations/**"
- babel.config.js
- manage.py
- pnpm-lock.yaml
- postcss.config.js
- puppet/**
- requirements/**
- scripts/**
- static/assets/**
- static/third/**
- tools/**
- web/babel.config.js
- web/postcss.config.js
- web/third/**
- web/webpack.config.ts
- webpack.config.ts
- yarn.lock
- zerver/worker/queue_processors.py
- zerver/lib/push_notifications.py
- zerver/decorator.py
- zproject/**
workflow_dispatch:
concurrency:
group: "${{ github.workflow }}-${{ github.head_ref || github.run_id }}"
cancel-in-progress: true
defaults:
run:
shell: bash
permissions:
contents: read
jobs:
production_build:
# This job builds a release tarball from the current commit, which
# will be used for all of the following install/upgrade tests.
name: Ubuntu 20.04 production build
name: Debian 10 production build
runs-on: ubuntu-latest
# Docker images are built from 'tools/ci/Dockerfile'; the comments at
# the top explain how to build and upload these images.
# Ubuntu 20.04 ships with Python 3.8.10.
container: zulip/ci:focal
# Debian 10 ships with Python 3.7.3.
container: zulip/ci:buster
steps:
- name: Add required permissions
run: |
@@ -64,60 +54,50 @@ jobs:
# cache action to work. It is owned by root currently.
sudo chmod -R 0777 /__w/_temp/
- uses: actions/checkout@v3
- uses: actions/checkout@v2
- name: Create cache directories
run: |
dirs=(/srv/zulip-{venv,emoji}-cache)
dirs=(/srv/zulip-{npm,venv,emoji}-cache)
sudo mkdir -p "${dirs[@]}"
sudo chown -R github "${dirs[@]}"
- name: Restore pnpm store
uses: actions/cache@v3
- name: Restore node_modules cache
uses: actions/cache@v2
with:
path: ~/.local/share/pnpm/store
key: v1-pnpm-store-focal-${{ hashFiles('pnpm-lock.yaml') }}
path: /srv/zulip-npm-cache
key: v1-yarn-deps-buster-${{ hashFiles('package.json') }}-${{ hashFiles('yarn.lock') }}
restore-keys: v1-yarn-deps-buster
- name: Restore python cache
uses: actions/cache@v3
uses: actions/cache@v2
with:
path: /srv/zulip-venv-cache
key: v1-venv-focal-${{ hashFiles('requirements/dev.txt') }}
restore-keys: v1-venv-focal
key: v1-venv-buster-${{ hashFiles('requirements/dev.txt') }}
restore-keys: v1-venv-buster
- name: Restore emoji cache
uses: actions/cache@v3
uses: actions/cache@v2
with:
path: /srv/zulip-emoji-cache
key: v1-emoji-focal-${{ hashFiles('tools/setup/emoji/emoji_map.json') }}-${{ hashFiles('tools/setup/emoji/build_emoji') }}-${{ hashFiles('tools/setup/emoji/emoji_setup_utils.py') }}-${{ hashFiles('tools/setup/emoji/emoji_names.py') }}-${{ hashFiles('package.json') }}
restore-keys: v1-emoji-focal
key: v1-emoji-buster-${{ hashFiles('tools/setup/emoji/emoji_map.json') }}-${{ hashFiles('tools/setup/emoji/build_emoji') }}-${{ hashFiles('tools/setup/emoji/emoji_setup_utils.py') }}-${{ hashFiles('tools/setup/emoji/emoji_names.py') }}-${{ hashFiles('package.json') }}
restore-keys: v1-emoji-buster
- name: Build production tarball
run: ./tools/ci/production-build
- name: Upload production build artifacts for install jobs
uses: actions/upload-artifact@v3
uses: actions/upload-artifact@v2
with:
name: production-tarball
path: /tmp/production-build
retention-days: 1
retention-days: 14
- name: Generate failure report string
id: failure_report_string
if: ${{ failure() && github.repository == 'zulip/zulip' && github.event_name == 'push' }}
run: tools/ci/generate-failure-message >> $GITHUB_OUTPUT
- name: Report status to CZO
if: ${{ failure() && github.repository == 'zulip/zulip' && github.event_name == 'push' }}
uses: zulip/github-actions-zulip/send-message@v1
with:
api-key: ${{ secrets.ZULIP_BOT_KEY }}
email: "github-actions-bot@chat.zulip.org"
organization-url: "https://chat.zulip.org"
to: "automated testing"
topic: ${{ steps.failure_report_string.outputs.topic }}
type: "stream"
content: ${{ steps.failure_report_string.outputs.content }}
- name: Report status
if: failure()
env:
ZULIP_BOT_KEY: ${{ secrets.ZULIP_BOT_KEY }}
run: tools/ci/send-failure-message
production_install:
# This job installs the server release tarball built above on a
@@ -130,24 +110,19 @@ jobs:
# Docker images are built from 'tools/ci/Dockerfile'; the comments at
# the top explain how to build and upload these images.
- docker_image: zulip/ci:focal
name: Ubuntu 20.04 production install and PostgreSQL upgrade with pgroonga
name: Ubuntu 20.04 production install
os: focal
extra-args: ""
extra_args: ""
- docker_image: zulip/ci:jammy
name: Ubuntu 22.04 production install
os: jammy
extra-args: ""
- docker_image: zulip/ci:buster
name: Debian 10 production install with custom db name and user
os: buster
extra_args: --test-custom-db
- docker_image: zulip/ci:bullseye
name: Debian 11 production install with custom db name and user
name: Debian 11 production install
os: bullseye
extra-args: --test-custom-db
- docker_image: zulip/ci:bookworm
name: Debian 12 production install
os: bookworm
extra-args: ""
extra_args: ""
name: ${{ matrix.name }}
container:
@@ -158,7 +133,7 @@ jobs:
steps:
- name: Download built production tarball
uses: actions/download-artifact@v3
uses: actions/download-artifact@v2
with:
name: production-tarball
path: /tmp
@@ -176,22 +151,25 @@ jobs:
chmod +x /tmp/production-pgroonga
chmod +x /tmp/production-install
chmod +x /tmp/production-verify
chmod +x /tmp/generate-failure-message
chmod +x /tmp/send-failure-message
- name: Create cache directories
run: |
dirs=(/srv/zulip-{venv,emoji}-cache)
dirs=(/srv/zulip-{npm,venv,emoji}-cache)
sudo mkdir -p "${dirs[@]}"
sudo chown -R github "${dirs[@]}"
- name: Restore pnpm store
uses: actions/cache@v3
- name: Restore node_modules cache
uses: actions/cache@v2
with:
path: ~/.local/share/pnpm/store
key: v1-pnpm-store-${{ matrix.os }}-${{ hashFiles('/tmp/pnpm-lock.yaml') }}
path: /srv/zulip-npm-cache
key: v1-yarn-deps-${{ matrix.os }}-${{ hashFiles('/tmp/package.json') }}-${{ hashFiles('/tmp/yarn.lock') }}
restore-keys: v1-yarn-deps-${{ matrix.os }}
- name: Install production
run: sudo /tmp/production-install ${{ matrix.extra-args }}
run: |
sudo service rabbitmq-server restart
sudo /tmp/production-install ${{ matrix.extra-args }}
- name: Verify install
run: sudo /tmp/production-verify ${{ matrix.extra-args }}
@@ -212,22 +190,11 @@ jobs:
if: ${{ matrix.os == 'focal' }}
run: sudo /tmp/production-verify ${{ matrix.extra-args }}
- name: Generate failure report string
id: failure_report_string
if: ${{ failure() && github.repository == 'zulip/zulip' && github.event_name == 'push' }}
run: /tmp/generate-failure-message >> $GITHUB_OUTPUT
- name: Report status to CZO
if: ${{ failure() && github.repository == 'zulip/zulip' && github.event_name == 'push' }}
uses: zulip/github-actions-zulip/send-message@v1
with:
api-key: ${{ secrets.ZULIP_BOT_KEY }}
email: "github-actions-bot@chat.zulip.org"
organization-url: "https://chat.zulip.org"
to: "automated testing"
topic: ${{ steps.failure_report_string.outputs.topic }}
type: "stream"
content: ${{ steps.failure_report_string.outputs.content }}
- name: Report status
if: failure()
env:
ZULIP_BOT_KEY: ${{ secrets.ZULIP_BOT_KEY }}
run: /tmp/send-failure-message
production_upgrade:
# The production upgrade job starts with a container with a
@@ -240,19 +207,14 @@ jobs:
fail-fast: false
matrix:
include:
# Docker images are built from 'tools/ci/Dockerfile.prod'; the comments at
# Docker images are built from 'tools/ci/Dockerfile'; the comments at
# the top explain how to build and upload these images.
- docker_image: zulip/ci:focal-3.2
name: 3.2 Version Upgrade
os: focal
- docker_image: zulip/ci:bullseye-4.2
name: 4.2 Version Upgrade
os: bullseye
- docker_image: zulip/ci:bullseye-5.0
name: 5.0 Version Upgrade
os: bullseye
- docker_image: zulip/ci:bullseye-6.0
name: 6.0 Version Upgrade
- docker_image: zulip/ci:buster-3.4
name: 3.4 Version Upgrade
os: buster
- docker_image: zulip/ci:bullseye-4.11
name: 4.11 Version Upgrade
os: bullseye
name: ${{ matrix.name }}
@@ -264,7 +226,7 @@ jobs:
steps:
- name: Download built production tarball
uses: actions/download-artifact@v3
uses: actions/download-artifact@v2
with:
name: production-tarball
path: /tmp
@@ -280,11 +242,11 @@ jobs:
# of the tarball uploaded by the upload artifact fix those.
chmod +x /tmp/production-upgrade
chmod +x /tmp/production-verify
chmod +x /tmp/generate-failure-message
chmod +x /tmp/send-failure-message
- name: Create cache directories
run: |
dirs=(/srv/zulip-{venv,emoji}-cache)
dirs=(/srv/zulip-{npm,venv,emoji}-cache)
sudo mkdir -p "${dirs[@]}"
sudo chown -R github "${dirs[@]}"
@@ -297,19 +259,8 @@ jobs:
# - name: Verify install
# run: sudo /tmp/production-verify
- name: Generate failure report string
id: failure_report_string
if: ${{ failure() && github.repository == 'zulip/zulip' && github.event_name == 'push' }}
run: /tmp/generate-failure-message >> $GITHUB_OUTPUT
- name: Report status to CZO
if: ${{ failure() && github.repository == 'zulip/zulip' && github.event_name == 'push' }}
uses: zulip/github-actions-zulip/send-message@v1
with:
api-key: ${{ secrets.ZULIP_BOT_KEY }}
email: "github-actions-bot@chat.zulip.org"
organization-url: "https://chat.zulip.org"
to: "automated testing"
topic: ${{ steps.failure_report_string.outputs.topic }}
type: "stream"
content: ${{ steps.failure_report_string.outputs.content }}
- name: Report status
if: failure()
env:
ZULIP_BOT_KEY: ${{ secrets.ZULIP_BOT_KEY }}
run: /tmp/send-failure-message


@@ -2,14 +2,11 @@ name: Update one click apps
on:
release:
types: [published]
permissions:
contents: read
jobs:
update-digitalocean-oneclick-app:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v2
- name: Update DigitalOcean one click app
env:
DIGITALOCEAN_API_KEY: ${{ secrets.ONE_CLICK_ACTION_DIGITALOCEAN_API_KEY }}
@@ -22,6 +19,6 @@ jobs:
run: |
export PATH="$HOME/.local/bin:$PATH"
git clone https://github.com/zulip/marketplace-partners
pip3 install python-digitalocean zulip fab-classic PyNaCl
pip3 install python-digitalocean zulip fab-classic
echo $PATH
python3 tools/oneclickapps/prepare_digital_ocean_one_click_app_release.py


@@ -4,24 +4,12 @@
name: Zulip CI
on:
push:
branches: ["*.x", chat.zulip.org, main]
tags: ["*"]
pull_request:
workflow_dispatch:
concurrency:
group: "${{ github.workflow }}-${{ github.head_ref || github.run_id }}"
cancel-in-progress: true
on: [push, pull_request]
defaults:
run:
shell: bash
permissions:
contents: read
jobs:
tests:
strategy:
@@ -30,29 +18,20 @@ jobs:
include:
# Base images are built using `tools/ci/Dockerfile.prod.template`.
# The comments at the top explain how to build and upload these images.
# Ubuntu 20.04 ships with Python 3.8.10.
- docker_image: zulip/ci:focal
name: Ubuntu 20.04 (Python 3.8, backend + frontend)
os: focal
include_documentation_tests: false
# Debian 10 ships with Python 3.7.3.
- docker_image: zulip/ci:buster
name: Debian 10 Buster (Python 3.7, backend + frontend)
os: buster
include_frontend_tests: true
# Ubuntu 20.04 ships with Python 3.8.2.
- docker_image: zulip/ci:focal
name: Ubuntu 20.04 Focal (Python 3.8, backend)
os: focal
include_frontend_tests: false
# Debian 11 ships with Python 3.9.2.
- docker_image: zulip/ci:bullseye
name: Debian 11 (Python 3.9, backend + documentation)
name: Debian 11 Bullseye (Python 3.9, backend)
os: bullseye
include_documentation_tests: true
include_frontend_tests: false
# Ubuntu 22.04 ships with Python 3.10.4.
- docker_image: zulip/ci:jammy
name: Ubuntu 22.04 (Python 3.10, backend)
os: jammy
include_documentation_tests: false
include_frontend_tests: false
# Debian 12 ships with Python 3.11.2.
- docker_image: zulip/ci:bookworm
name: Debian 12 (Python 3.11, backend)
os: bookworm
include_documentation_tests: false
include_frontend_tests: false
runs-on: ubuntu-latest
@@ -68,32 +47,33 @@ jobs:
HOME: /home/github/
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v2
- name: Create cache directories
run: |
dirs=(/srv/zulip-{venv,emoji}-cache)
dirs=(/srv/zulip-{npm,venv,emoji}-cache)
sudo mkdir -p "${dirs[@]}"
sudo chown -R github "${dirs[@]}"
- name: Restore pnpm store
uses: actions/cache@v3
- name: Restore node_modules cache
uses: actions/cache@v2
with:
path: ~/.local/share/pnpm/store
key: v1-pnpm-store-${{ matrix.os }}-${{ hashFiles('pnpm-lock.yaml') }}
path: /srv/zulip-npm-cache
key: v1-yarn-deps-${{ matrix.os }}-${{ hashFiles('package.json') }}-${{ hashFiles('yarn.lock') }}
restore-keys: v1-yarn-deps-${{ matrix.os }}
- name: Restore python cache
uses: actions/cache@v3
uses: actions/cache@v2
with:
path: /srv/zulip-venv-cache
key: v1-venv-${{ matrix.os }}-${{ hashFiles('requirements/dev.txt') }}
restore-keys: v1-venv-${{ matrix.os }}
- name: Restore emoji cache
uses: actions/cache@v3
uses: actions/cache@v2
with:
path: /srv/zulip-emoji-cache
key: v1-emoji-${{ matrix.os }}-${{ hashFiles('tools/setup/emoji/emoji_map.json', 'tools/setup/emoji/build_emoji', 'tools/setup/emoji/emoji_setup_utils.py', 'tools/setup/emoji/emoji_names.py', 'package.json') }}
key: v1-emoji-${{ matrix.os }}-${{ hashFiles('tools/setup/emoji/emoji_map.json') }}-${{ hashFiles('tools/setup/emoji/build_emoji') }}-${{ hashFiles('tools/setup/emoji/emoji_setup_utils.py') }}-${{ hashFiles('tools/setup/emoji/emoji_names.py') }}-${{ hashFiles('package.json') }}
restore-keys: v1-emoji-${{ matrix.os }}
- name: Install dependencies
@@ -130,7 +110,7 @@ jobs:
- name: Run backend tests
run: |
source tools/ci/activate-venv
./tools/test-backend --coverage --xml-report --no-html-report --include-webhooks --no-cov-cleanup --ban-console-output
./tools/test-backend --coverage --include-webhooks --no-cov-cleanup --ban-console-output
- name: Run mypy
run: |
@@ -144,9 +124,8 @@ jobs:
run: |
source tools/ci/activate-venv
# Currently our compiled requirements files will differ for different
# Python versions, so we will run test-locked-requirements only on the
# platform with the oldest one.
# Currently our compiled requirements files will differ for different python versions
# so we will run test-locked-requirements only for Debian 10.
# ./tools/test-locked-requirements
# ./tools/test-run-dev # https://github.com/zulip/zulip/pull/14233
#
@@ -157,17 +136,15 @@ jobs:
./tools/test-migrations
./tools/setup/optimize-svg --check
./tools/setup/generate_integration_bots_avatars.py --check-missing
./tools/ci/check-executables
# Ban check-database-compatibility from transitively
# Ban check-database-compatibility.py from transitively
# relying on static/generated, because it might not be
# up-to-date at that point in upgrade-zulip-stage-2.
chmod 000 static/generated web/generated
./scripts/lib/check-database-compatibility
chmod 755 static/generated web/generated
chmod 000 static/generated
./scripts/lib/check-database-compatibility.py
chmod 755 static/generated
- name: Run documentation and api tests
if: ${{ matrix.include_documentation_tests }}
run: |
source tools/ci/activate-venv
# In CI, we only test links we control in test-documentation to avoid flakes
@@ -203,10 +180,6 @@ jobs:
source tools/ci/activate-venv
./tools/test-js-with-puppeteer
- name: Check pnpm dedupe
if: ${{ matrix.include_frontend_tests }}
run: pnpm dedupe --check
- name: Check for untracked files
run: |
source tools/ci/activate-venv
@@ -219,7 +192,7 @@ jobs:
fi
- name: Test locked requirements
if: ${{ matrix.os == 'focal' }}
if: ${{ matrix.os == 'buster' }}
run: |
. /srv/zulip-py3-venv/bin/activate && \
./tools/test-locked-requirements
@@ -229,35 +202,25 @@ jobs:
# Only upload coverage when both frontend and backend
# tests are run.
if: ${{ matrix.include_frontend_tests }}
uses: codecov/codecov-action@v3
uses: codecov/codecov-action@v2
with:
files: var/coverage.xml,var/node-coverage/lcov.info
- name: Store Puppeteer artifacts
# Upload these on failure, as well
if: ${{ always() && matrix.include_frontend_tests }}
uses: actions/upload-artifact@v3
uses: actions/upload-artifact@v2
with:
name: puppeteer
path: ./var/puppeteer
retention-days: 60
- name: Check development database build
if: ${{ matrix.os == 'focal' || matrix.os == 'bullseye' || matrix.os == 'jammy' }}
run: ./tools/ci/setup-backend
- name: Generate failure report string
id: failure_report_string
if: ${{ failure() && github.repository == 'zulip/zulip' && github.event_name == 'push' }}
run: tools/ci/generate-failure-message >> $GITHUB_OUTPUT
- name: Report status to CZO
if: ${{ failure() && github.repository == 'zulip/zulip' && github.event_name == 'push' }}
uses: zulip/github-actions-zulip/send-message@v1
with:
api-key: ${{ secrets.ZULIP_BOT_KEY }}
email: "github-actions-bot@chat.zulip.org"
organization-url: "https://chat.zulip.org"
to: "automated testing"
topic: ${{ steps.failure_report_string.outputs.topic }}
type: "stream"
content: ${{ steps.failure_report_string.outputs.content }}
- name: Report status
if: failure()
env:
ZULIP_BOT_KEY: ${{ secrets.ZULIP_BOT_KEY }}
run: tools/ci/send-failure-message

.gitignore

@@ -33,7 +33,6 @@ package-lock.json
!/var/puppeteer/test_credentials.d.ts
/.dmypy.json
/.ruff_cache
# Generated i18n data
/locale/en
@@ -44,11 +43,11 @@ package-lock.json
# Static build
*.mo
npm-debug.log
/.pnpm-store
/node_modules
/prod-static
/staticfiles.json
/webpack-stats-production.json
/yarn-error.log
zulip-git-version
# Test / analysis tools


@@ -1,13 +1,13 @@
[general]
ignore=title-trailing-punctuation, body-min-length, body-is-missing
extra-path=tools/lib/gitlint_rules.py
extra-path=tools/lib/gitlint-rules.py
[title-match-regex]
regex=^(.+:\ )?[A-Z].+\.$
[title-max-length]
line-length=72
line-length=76
[body-max-line-length]
line-length=76


@@ -12,120 +12,62 @@
# # shows raw names/emails, filtered by mapped name:
# $ git log --format='%an %ae' --author=$NAME | uniq -c
acrefoot <acrefoot@zulip.com> <acrefoot@humbughq.com>
acrefoot <acrefoot@zulip.com> <acrefoot@dropbox.com>
acrefoot <acrefoot@zulip.com> <acrefoot@alum.mit.edu>
Adam Benesh <Adam.Benesh@gmail.com> <Adam-Daniel.Benesh@t-systems.com>
Adam Benesh <Adam.Benesh@gmail.com>
Adarsh Tiwari <xoldyckk@gmail.com>
Alex Vandiver <alexmv@zulip.com> <alex@chmrr.net>
Alex Vandiver <alexmv@zulip.com> <github@chmrr.net>
Allen Rabinovich <allenrabinovich@yahoo.com> <allenr@humbughq.com>
Allen Rabinovich <allenrabinovich@yahoo.com> <allenr@zulip.com>
Alya Abbott <alya@zulip.com> <2090066+alya@users.noreply.github.com>
Aman Agrawal <amanagr@zulip.com> <f2016561@pilani.bits-pilani.ac.in>
Aman Agrawal <amanagr@zulip.com>
Anders Kaseorg <anders@zulip.com> <anders@zulipchat.com>
Anders Kaseorg <anders@zulip.com> <andersk@mit.edu>
Aryan Shridhar <aryanshridhar7@gmail.com> <53977614+aryanshridhar@users.noreply.github.com>
Aryan Shridhar <aryanshridhar7@gmail.com>
aparna-bhatt <aparnabhatt2001@gmail.com> <86338542+aparna-bhatt@users.noreply.github.com>
Ashwat Kumar Singh <ashwat.kumarsingh.met20@itbhu.ac.in>
Austin Riba <austin@zulip.com> <austin@m51.io>
BIKI DAS <bikid475@gmail.com>
Brijmohan Siyag <brijsiyag@gmail.com>
Brock Whittaker <brock@zulipchat.com> <bjwhitta@asu.edu>
Brock Whittaker <brock@zulipchat.com> <brockwhittaker@Brocks-MacBook.local>
Brock Whittaker <brock@zulipchat.com> <brock@zulipchat.org>
Chris Bobbe <cbobbe@zulip.com> <cbobbe@zulipchat.com>
Chris Bobbe <cbobbe@zulip.com> <csbobbe@gmail.com>
Danny Su <contact@dannysu.com> <opensource@emailengine.org>
Dinesh <chdinesh1089@gmail.com>
Dinesh <chdinesh1089@gmail.com> <chdinesh1089>
Eeshan Garg <eeshan@zulip.com> <jerryguitarist@gmail.com>
Eric Smith <erwsmith@gmail.com> <99841919+erwsmith@users.noreply.github.com>
Evy Kassirer <evy.kassirer@gmail.com>
Evy Kassirer <evy.kassirer@gmail.com> <evykassirer@users.noreply.github.com>
Ganesh Pawar <pawarg256@gmail.com> <58626718+ganpa3@users.noreply.github.com>
Greg Price <greg@zulip.com> <gnprice@gmail.com>
Greg Price <greg@zulip.com> <greg@zulipchat.com>
Greg Price <greg@zulip.com> <price@mit.edu>
Hardik Dharmani <Ddharmani99@gmail.com> <ddharmani99@gmail.com>
Hemant Umre <hemantumre12@gmail.com> <87542880+HemantUmre12@users.noreply.github.com>
Jai soni <jai_s@me.iitr.ac.in>
Jai soni <jai_s@me.iitr.ac.in> <76561593+jai2201@users.noreply.github.com>
Jeff Arnold <jbarnold@gmail.com> <jbarnold@humbughq.com>
Jeff Arnold <jbarnold@gmail.com> <jbarnold@zulip.com>
Jessica McKellar <jesstess@mit.edu> <jesstess@humbughq.com>
Jessica McKellar <jesstess@mit.edu> <jesstess@zulip.com>
Julia Bichler <julia.bichler@tum.de> <74348920+juliaBichler01@users.noreply.github.com>
Karl Stolley <karl@zulip.com> <karl@stolley.dev>
Kevin Mehall <km@kevinmehall.net> <kevin@humbughq.com>
Kevin Mehall <km@kevinmehall.net> <kevin@zulip.com>
Kevin Scott <kevin.scott.98@gmail.com>
Lalit Kumar Singh <lalitkumarsingh3716@gmail.com>
Lauryn Menard <lauryn@zulip.com> <lauryn.menard@gmail.com>
Lauryn Menard <lauryn@zulip.com> <63245456+laurynmm@users.noreply.github.com>
Mateusz Mandera <mateusz.mandera@zulip.com> <mateusz.mandera@protonmail.com>
Matt Keller <matt@zulip.com>
Matt Keller <matt@zulip.com> <m@cognusion.com>
m-e-l-u-h-a-n <purushottam.tiwari.cd.cse19@itbhu.ac.in>
m-e-l-u-h-a-n <purushottam.tiwari.cd.cse19@itbhu.ac.in> <pururshottam.tiwari.cd.cse19@itbhu.ac.in>
Noble Mittal <noblemittal@outlook.com> <62551163+beingnoble03@users.noreply.github.com>
nzai <nzaih18@gmail.com> <70953556+nzaih1999@users.noreply.github.com>
Palash Baderia <palash.baderia@outlook.com>
Palash Baderia <palash.baderia@outlook.com> <66828942+palashb01@users.noreply.github.com>
Palash Raghuwanshi <singhpalash0@gmail.com>
Parth <mittalparth22@gmail.com>
Priyam Seth <sethpriyam1@gmail.com> <b19188@students.iitmandi.ac.in>
Ray Kraesig <rkraesig@zulip.com> <rkraesig@zulipchat.com>
Reid Barton <rwbarton@gmail.com> <rwbarton@humbughq.com>
Rein Zustand (rht) <rhtbot@protonmail.com>
Rishi Gupta <rishig@zulipchat.com> <rishig+git@mit.edu>
Rishi Gupta <rishig@zulipchat.com> <rishig@kandralabs.com>
Rishi Gupta <rishig@zulipchat.com> <rishig@users.noreply.github.com>
Rishabh Maheshwari <b20063@students.iitmandi.ac.in>
Rixant Rokaha <rixantrokaha@gmail.com>
Rixant Rokaha <rixantrokaha@gmail.com> <rishantrokaha@gmail.com>
Rixant Rokaha <rixantrokaha@gmail.com> <rrokaha@caldwell.edu>
Sahil Batra <sahil@zulip.com> <35494118+sahil839@users.noreply.github.com>
Sahil Batra <sahil@zulip.com> <sahilbatra839@gmail.com>
Satyam Bansal <sbansal1999@gmail.com>
Reid Barton <rwbarton@gmail.com> <rwbarton@humbughq.com>
Sayam Samal <samal.sayam@gmail.com>
Scott Feeney <scott@oceanbase.org> <scott@humbughq.com>
Scott Feeney <scott@oceanbase.org> <scott@zulip.com>
Shlok Patel <shlokcpatel2001@gmail.com>
Somesh Ranjan <somesh.ranjan.met20@itbhu.ac.in> <77766761+somesh202@users.noreply.github.com>
Steve Howell <showell@zulip.com> <showell30@yahoo.com>
Steve Howell <showell@zulip.com> <showell@yahoo.com>
Steve Howell <showell@zulip.com> <showell@zulipchat.com>
Steve Howell <showell@zulip.com> <steve@humbughq.com>
Steve Howell <showell@zulip.com> <steve@zulip.com>
strifel <info@strifel.de>
Tim Abbott <tabbott@zulip.com>
Tim Abbott <tabbott@zulip.com> <tabbott@dropbox.com>
Tim Abbott <tabbott@zulip.com> <tabbott@humbughq.com>
Tim Abbott <tabbott@zulip.com> <tabbott@mit.edu>
Tim Abbott <tabbott@zulip.com> <tabbott@zulipchat.com>
Ujjawal Modi <umodi2003@gmail.com> <99073049+Ujjawal3@users.noreply.github.com>
Vishnu KS <vishnu@zulip.com> <hackerkid@vishnuks.com>
Vishnu KS <vishnu@zulip.com> <yo@vishnuks.com>
Alya Abbott <alya@zulip.com> <alyaabbott@elance-odesk.com>
umkay <ukhan@zulipchat.com> <umaimah.k@gmail.com>
umkay <ukhan@zulipchat.com> <umkay@users.noreply.github.com>
Waseem Daher <wdaher@zulip.com> <wdaher@humbughq.com>
Waseem Daher <wdaher@zulip.com> <wdaher@dropbox.com>
Sahil Batra <sahil@zulip.com> <sahilbatra839@gmail.com>
Yash RE <33805964+YashRE42@users.noreply.github.com> <YashRE42@github.com>
Yash RE <33805964+YashRE42@users.noreply.github.com>
Yogesh Sirsat <yogeshsirsat56@gmail.com>
Yogesh Sirsat <yogeshsirsat56@gmail.com> <41695888+yogesh-sirsat@users.noreply.github.com>
Zeeshan Equbal <equbalzeeshan@gmail.com> <54993043+zee-bit@users.noreply.github.com>
Zeeshan Equbal <equbalzeeshan@gmail.com>
Zev Benjamin <zev@zulip.com> <zev@dropbox.com>
Zev Benjamin <zev@zulip.com> <zev@humbughq.com>
Zev Benjamin <zev@zulip.com> <zev@mit.edu>
Zixuan James Li <p359101898@gmail.com>
Zixuan James Li <p359101898@gmail.com> <39874143+PIG208@users.noreply.github.com>
Zixuan James Li <p359101898@gmail.com> <359101898@qq.com>
Joseph Ho <josephho678@gmail.com>
Joseph Ho <josephho678@gmail.com> <62449508+Joelute@users.noreply.github.com>

.npmrc

@@ -1 +0,0 @@
ignore-dep-scripts=true


@@ -1,11 +1,8 @@
pnpm-lock.yaml
/api_docs/**/*.md
/corporate/tests/stripe_fixtures
/help/**/*.md
/locale
/static/third
/templates/**/*.md
/tools/setup/emoji/emoji_map.json
/web/third
/zerver/tests/fixtures
/zerver/webhooks/*/doc.md
/zerver/webhooks/*/fixtures


@@ -1,15 +0,0 @@
# https://docs.readthedocs.io/en/stable/config-file/v2.html
version: 2
build:
os: ubuntu-22.04
tools:
python: "3.10"
sphinx:
configuration: docs/conf.py
fail_on_warning: true
python:
install:
- requirements: requirements/docs.txt


@@ -1,39 +1,32 @@
# Migrated from transifex-client format with `tx migrate`
#
# See https://developers.transifex.com/docs/using-the-client which hints at
# this format, but in general, the headings are in the format of:
#
# [o:<org>:p:<project>:r:<resource>]
[main]
host = https://www.transifex.com
lang_map = zh-Hans: zh_Hans, zh-Hant: zh_Hant
[o:zulip:p:zulip:r:djangopo]
[zulip.djangopo]
file_filter = locale/<lang>/LC_MESSAGES/django.po
source_file = locale/en/LC_MESSAGES/django.po
source_lang = en
type = PO
[o:zulip:p:zulip:r:mobile]
[zulip.translationsjson]
file_filter = locale/<lang>/translations.json
source_file = locale/en/translations.json
source_lang = en
type = KEYVALUEJSON
[zulip.mobile]
file_filter = locale/<lang>/mobile.json
source_file = locale/en/mobile.json
source_lang = en
type = KEYVALUEJSON
[o:zulip:p:zulip:r:translationsjson]
file_filter = locale/<lang>/translations.json
source_file = locale/en/translations.json
source_lang = en
type = KEYVALUEJSON
[o:zulip:p:zulip-test:r:djangopo]
[zulip-test.djangopo]
file_filter = locale/<lang>/LC_MESSAGES/django.po
source_file = locale/en/LC_MESSAGES/django.po
source_lang = en
type = PO
[o:zulip:p:zulip-test:r:translationsjson]
[zulip-test.translationsjson]
file_filter = locale/<lang>/translations.json
source_file = locale/en/translations.json
source_lang = en

.yarnrc

@@ -0,0 +1 @@
ignore-scripts true


@@ -102,71 +102,3 @@ This Code of Conduct is adapted from the
under a
[Creative Commons BY-SA](https://creativecommons.org/licenses/by-sa/4.0/)
license.
## Moderating the Zulip community
Anyone can help moderate the Zulip community by helping make sure that folks are
aware of the [community guidelines](https://zulip.com/development-community/)
and this Code of Conduct, and that we maintain a positive and respectful
atmosphere.
Here are some guidelines for how you can help:
- Be friendly! Welcoming folks, thanking them for their feedback, ideas and effort,
and just trying to keep the atmosphere warm make the whole community function
more smoothly. New participants who feel accepted, listened to and respected
are likely to treat others the same way.
- Be familiar with the [community
guidelines](https://zulip.com/development-community/), and cite them liberally
when a user violates them. Be polite but firm. Some examples:
- @user please note that there is no need to @-mention @\_**Tim Abbott** when
you ask a question. As noted in the [guidelines for this
community](https://zulip.com/development-community/):
> Use @-mentions sparingly… there is generally no need to @-mention a
> core contributor unless you need their timely attention.
- @user, please keep in mind the following [community
guideline](https://zulip.com/development-community/):
> Don't ask the same question in multiple places. Moderators read every
> public stream, and make sure every question gets a reply.
I've gone ahead and moved the other copy of this message to this thread.
- If asked a question in a PM that is better discussed in a public stream:
> Hi @user! Please start by reviewing
> https://zulip.com/development-community/#community-norms to learn how to
> get help in this community.
- Users sometimes think chat.zulip.org is a testing instance. When this happens,
kindly direct them to use the **#test here** stream.
- If you see a message that's posted in the wrong place, go ahead and move it if
you have permissions to do so, even if you don't plan to respond to it.
Leaving the “Send automated notice to new topic” option enabled helps make it
clear what happened to the person who sent the message.
If you are responding to a message that's been moved, mention the user in your
reply, so that the mention serves as a notification of the new location for
their conversation.
- If a user is posting spam, please report it to an administrator. They will:
- Change the user's name to `<name> (spammer)` and deactivate them.
- Delete any spam messages they posted in public streams.
- We care very much about maintaining a respectful tone in our community. If you
see someone being mean or rude, point out that their tone is inappropriate,
and ask them to communicate their perspective in a respectful way in the
future. If you don't feel comfortable doing so yourself, feel free to ask a
member of Zulip's core team to take care of the situation.
- Try to assume the best intentions from others (given the range of
possibilities presented by their visible behavior), and stick with a friendly
and positive tone even when someone's behavior is poor or disrespectful.
Everyone has bad days and stressful situations that can result in them
behaving not their best, and while we should be firm about our community
rules, we should also enforce them with kindness.


@@ -1,36 +1,17 @@
# Contributing guide
# Contributing to Zulip
Welcome to the Zulip community!
## Zulip development community
## Community
The primary communication forum for the Zulip community is the Zulip
server hosted at [chat.zulip.org](https://chat.zulip.org/):
- **Users** and **administrators** of Zulip organizations stop by to
ask questions, offer feedback, and participate in product design
discussions.
- **Contributors to the project**, including the **core Zulip
development team**, discuss ongoing and future projects, brainstorm
ideas, and generally help each other out.
Everyone is welcome to [sign up](https://chat.zulip.org/) and
participate — we love hearing from our users! Public streams in the
community receive thousands of messages a week. We recommend signing
up using the special invite links for
[users](https://chat.zulip.org/join/t5crtoe62bpcxyisiyglmtvb/),
[self-hosters](https://chat.zulip.org/join/wnhv3jzm6afa4raenedanfno/)
and
[contributors](https://chat.zulip.org/join/npzwak7vpmaknrhxthna3c7p/)
to get a curated list of initial stream subscriptions.
To learn how to get started participating in the community, including [community
norms](https://zulip.com/development-community/#community-norms) and [where to
post](https://zulip.com/development-community/#where-do-i-send-my-message),
check out our [Zulip development community
guide](https://zulip.com/development-community/). The Zulip community is
governed by a [code of
conduct](https://zulip.readthedocs.io/en/latest/code-of-conduct.html).
The
[Zulip community server](https://zulip.com/development-community/)
is the primary communication forum for the Zulip community. It is a good
place to start whether you have a question, are a new contributor, are a new
user, or anything else. Please review our
[community norms](https://zulip.com/development-community/#community-norms)
before posting. The Zulip community is also governed by a
[code of conduct](https://zulip.readthedocs.io/en/latest/code-of-conduct.html).
## Ways to contribute
@@ -55,14 +36,9 @@ needs doing:
**Non-code contributions**: Some of the most valuable ways to contribute
don't require touching the codebase at all. For example, you can:
- Report issues, including both [feature
requests](https://zulip.readthedocs.io/en/latest/contributing/suggesting-features.html)
and [bug
reports](https://zulip.readthedocs.io/en/latest/contributing/reporting-bugs.html).
- [Report issues](#reporting-issues), including both feature requests and
bug reports.
- [Give feedback](#user-feedback) if you are evaluating or using Zulip.
- [Participate
thoughtfully](https://zulip.readthedocs.io/en/latest/contributing/design-discussions.html)
in design discussions.
- [Sponsor Zulip](https://github.com/sponsors/zulip) through the GitHub sponsors program.
- [Translate](https://zulip.readthedocs.io/en/latest/translating/translating.html)
Zulip into your language.
@@ -78,13 +54,12 @@ to help.
- First, make an account on the
[Zulip community server](https://zulip.com/development-community/),
paying special attention to the
[community norms](https://zulip.com/development-community/#community-norms).
If you'd like, introduce yourself in
paying special attention to the community norms. If you'd like, introduce
yourself in
[#new members](https://chat.zulip.org/#narrow/stream/95-new-members), using
your name as the topic. Bonus: tell us about your first impressions of
Zulip, and anything that felt confusing/broken or interesting/helpful as you
started using the product.
Zulip, and anything that felt confusing/broken as you started using the
product.
- Read [What makes a great Zulip contributor](#what-makes-a-great-zulip-contributor).
- [Install the development environment](https://zulip.readthedocs.io/en/latest/development/overview.html),
getting help in
@@ -149,6 +124,14 @@ Note that you are _not_ claiming an issue while you are iterating through steps
1-4. _Before you claim an issue_, you should be confident that you will be able to
tackle it effectively.
If the lists of issues are overwhelming, you can post in
[#new members](https://chat.zulip.org/#narrow/stream/95-new-members) with a
bit about your background and interests, and we'll help you out. The most
important thing to say is whether you're looking for a backend (Python),
frontend (JavaScript and TypeScript), mobile (React Native), desktop (Electron),
documentation (English) or visual design (JavaScript/TypeScript + CSS) issue, and a
bit about your programming experience and available time.
Additional tips for the [main server and web app
repository](https://github.com/zulip/zulip/issues?q=is%3Aopen+is%3Aissue+label%3A%22help+wanted%22):
@@ -165,21 +148,15 @@ repository](https://github.com/zulip/zulip/issues?q=is%3Aopen+is%3Aissue+label%3
### Claiming an issue
#### In the main server/web app repository and Zulip Terminal repository
#### In the main server and web app repository
The Zulip server/web app repository
([`zulip/zulip`](https://github.com/zulip/zulip/)) and the Zulip Terminal
repository ([`zulip/zulip-terminal`](https://github.com/zulip/zulip-terminal/))
are set up with a GitHub workflow bot called
[Zulipbot](https://github.com/zulip/zulipbot), which manages issues and pull
requests in order to create a better workflow for Zulip contributors.
To claim an issue in these repositories, simply post a comment that says
`@zulipbot claim` to the issue thread. If the issue is tagged with a [help
After making sure the issue is tagged with a [help
wanted](https://github.com/zulip/zulip/issues?q=is%3Aopen+is%3Aissue+label%3A%22help+wanted%22)
label, Zulipbot will immediately assign the issue to you.
label, post a comment with `@zulipbot claim` to the issue thread.
[Zulipbot](https://github.com/zulip/zulipbot) is a GitHub workflow bot; it will
assign you to the issue and label the issue as "in progress".
Note that new contributors can only claim one issue until their first pull request is
New contributors can only claim one issue until their first pull request is
merged. This is to encourage folks to finish ongoing work before starting
something new. If you would like to pick up a new issue while waiting for review
on an almost-ready pull request, you can post a comment to this effect on the
@@ -187,11 +164,8 @@ issue you're interested in.
#### In other Zulip repositories
There is no bot for other Zulip repositories
([`zulip/zulip-mobile`](https://github.com/zulip/zulip-mobile/), etc.). If
you are interested in claiming an issue in one of these repositories, simply
post a comment on the issue thread saying that you'd like to work on it. There
is no need to @-mention the issue creator in your comment.
There is no bot for other repositories, so you can simply post a comment saying
that you'd like to work on the issue.
Please follow the same guidelines as described above: find an issue labeled
"help wanted", and only pick up one issue at a time to start with.
@@ -210,16 +184,95 @@ stream](https://chat.zulip.org/#narrow/stream/101-design) in the [Zulip
development community](https://zulip.com/development-community/)
For more advice, see [What makes a great Zulip
contributor?](#what-makes-a-great-zulip-contributor) below. It's OK if your
first issue takes you a while; that's normal! You'll be able to work a lot
faster as you build experience.
contributor?](https://zulip.readthedocs.io/en/latest/overview/contributing.html#what-makes-a-great-zulip-contributor)
below.
### Submitting a pull request
See the [pull request review
process](https://zulip.readthedocs.io/en/latest/contributing/review-process.html)
guide for detailed instructions on how to submit a pull request, and information
on the stages of review your PR will go through.
When you believe your code is ready, follow the [guide on how to review
code](https://zulip.readthedocs.io/en/latest/contributing/code-reviewing.html#how-to-review-code)
to review your own work. You can often find things you missed by taking a step
back to look over your work before asking others to do so. Catching mistakes
yourself will help your PRs be merged faster, and folks will appreciate the
quality and professionalism of your work.
Then, submit your changes. Carefully reading our [Git guide][git-guide], and in
particular the section on [making a pull request][git-guide-make-pr],
will help avoid many common mistakes.
Once you are satisfied with the quality of your PR, follow the
[guidelines on asking for a code
review](https://zulip.readthedocs.io/en/latest/contributing/code-reviewing.html#asking-for-a-code-review)
to request a review. If you are not sure what's best, simply post a
comment on the main GitHub thread for your PR clearly indicating that
it is ready for review, and the project maintainers will take a look
and follow up with next steps.
It's OK if your first issue takes you a while; that's normal! You'll be
able to work a lot faster as you build experience.
If it helps your workflow, you can submit a work-in-progress pull
request before your work is ready for review. Simply prefix the title
of work in progress pull requests with `[WIP]`, and then remove the
prefix when you think it's time for someone else to review your work.
[git-guide]: https://zulip.readthedocs.io/en/latest/git/
[git-guide-make-pr]: https://zulip.readthedocs.io/en/latest/git/pull-requests.html
### Stages of a pull request
Your pull request will likely go through several stages of review.
1. If your PR makes user-facing changes, the UI and user experience may be
reviewed early on, without reference to the code. You will get feedback on
any user-facing bugs in the implementation. To minimize the number of review
round-trips, make sure to [thoroughly
test](https://zulip.readthedocs.io/en/latest/contributing/code-reviewing.html#manual-testing)
your own PR prior to asking for review.
2. There may be choices made in the implementation that the reviewer
will ask you to revisit. This process will go more smoothly if you
specifically call attention to the decisions you made while
drafting the PR and any points about which you are uncertain. The
PR description and comments on your own PR are good ways to do this.
3. Oftentimes, seeing an initial implementation will make it clear that the
product design for a feature needs to be revised, or that additional changes
are needed. The reviewer may therefore ask you to amend or change the
implementation. Some changes may be blockers for getting the PR merged, while
others may be improvements that can happen afterwards. Feel free to ask if
it's unclear which type of feedback you're getting. (Follow-ups can be a
great next issue to work on!)
4. In addition to any UI/user experience review, all PRs will go through one or
more rounds of code review. Your code may initially be [reviewed by other
contributors](https://zulip.readthedocs.io/en/latest/contributing/code-reviewing.html).
This helps us make good use of project maintainers' time, and helps you make
progress on the PR by getting more frequent feedback. A project maintainer
may leave a comment asking someone with expertise in the area you're working
on to review your work.
5. Final code review and integration for server and webapp PRs is generally done
by `@timabbott`.
#### How to help move the review process forward
The key to keeping your review moving through the review process is to:
- Address _all_ the feedback to the best of your ability.
- Make it clear when the requested changes have been made
and you believe it's time for another look.
- Make it as easy as possible to review the changes you made.
In order to do this, when you believe you have addressed the previous round of
feedback on your PR as best you can, post a comment asking reviewers to take
another look. Your comment should make it easy to understand what has been done
and what remains by:
- Summarizing the changes made since the last review you received.
- Highlighting remaining questions or decisions, with links to any relevant
chat.zulip.org threads.
- Providing updated screenshots and information on manual testing if
appropriate.
The easier it is to review your work, the more likely you are to receive quick
feedback.
### Beyond the first issue
@@ -250,23 +303,12 @@ labels.
have a new feature you'd like to add, you can start a conversation [in our
development community](https://zulip.com/development-community/#where-do-i-send-my-message)
explaining the feature idea and the problem that you're hoping to solve.
- **I'm waiting for the next round of review on my PR. Can I pick up
another issue in the meantime?** Someone's first Zulip PR often
requires quite a bit of iteration, so please [make sure your pull
request is reviewable][reviewable-pull-requests] and go through at
least one round of feedback from others before picking up a second
issue. After that, sure! If
[Zulipbot](https://github.com/zulip/zulipbot) does not allow you to
claim an issue, you can post a comment describing the status of your
other work on the issue you're interested in, and asking for the
issue to be assigned to you. Note that addressing feedback on
in-progress PRs should always take priority over starting a new PR.
- **I think my PR is done, but it hasn't been merged yet. What's going on?**
1. **Double-check that you have addressed all the feedback**, including any comments
on [Git commit
discipline](https://zulip.readthedocs.io/en/latest/contributing/commit-discipline.html).
discipline](https://zulip.readthedocs.io/en/latest/contributing/version-control.html#commit-discipline).
2. If all the feedback has been addressed, did you [leave a
comment](https://zulip.readthedocs.io/en/latest/contributing/review-process.html#how-to-help-move-the-review-process-forward)
comment](https://zulip.readthedocs.io/en/latest/overview/contributing.html#how-to-help-move-the-review-process-forward)
explaining that you have done so and **requesting another review**? If not,
it may not be clear to project maintainers or reviewers that your PR is
ready for another look.
@@ -282,28 +324,26 @@ labels.
occasionally take a few weeks for a PR in the final stages of the review
process to be merged.
[reviewable-pull-requests]: https://zulip.readthedocs.io/en/latest/contributing/reviewable-prs.html
## What makes a great Zulip contributor?
Zulip has a lot of experience working with new contributors. In our
experience, these are the best predictors of success:
- [Asking great questions][great-questions]. It's very hard to answer a general
question like, "How do I do this issue?" When asking for help, explain your
current understanding, including what you've done or tried so far and where
- Posting good questions. It's very hard to answer a general question like, "How
do I do this issue?" When asking for help, explain
your current understanding, including what you've done or tried so far and where
you got stuck. Post tracebacks or other error messages if appropriate. For
more advice, check out [our guide][great-questions]!
more information, check out the ["Getting help" section of our community
guidelines](https://zulip.com/development-community/#getting-help) and
[this essay][good-questions-blog] for some good advice.
- Learning and practicing
[Git commit discipline](https://zulip.readthedocs.io/en/latest/contributing/commit-discipline.html).
[Git commit discipline](https://zulip.readthedocs.io/en/latest/contributing/version-control.html#commit-discipline).
- Submitting carefully tested code. See our [detailed guide on how to review
code](https://zulip.readthedocs.io/en/latest/contributing/code-reviewing.html#how-to-review-code)
(yours or someone else's).
- Posting
[screenshots or GIFs](https://zulip.readthedocs.io/en/latest/tutorials/screenshot-and-gif-software.html)
for frontend changes.
- Working to [make your pull requests easy to
review](https://zulip.readthedocs.io/en/latest/contributing/reviewable-prs.html).
- Clearly describing what you have implemented and why. For example, if your
implementation differs from the issue description in some way or is a partial
step towards the requirements described in the issue, be sure to call
@@ -314,7 +354,33 @@ experience, these are the best predictors of success:
- Being helpful and friendly on the [Zulip community
server](https://zulip.com/development-community/).
[great-questions]: https://zulip.readthedocs.io/en/latest/contributing/asking-great-questions.html
[good-questions-blog]: https://jvns.ca/blog/good-questions/
These are also the main criteria we use to select candidates for all
of our outreach programs.
## Reporting issues
If you find an easily reproducible bug and/or are experienced in reporting
bugs, feel free to just open an issue on the relevant project on GitHub.
If you have a feature request or are not yet sure what the underlying bug
is, the best place to post issues is
[#issues](https://chat.zulip.org/#narrow/stream/9-issues) (or
[#mobile](https://chat.zulip.org/#narrow/stream/48-mobile) or
[#desktop](https://chat.zulip.org/#narrow/stream/16-desktop)) on the
[Zulip community server](https://zulip.com/development-community/).
This allows us to interactively figure out what is going on, let you know if
a similar issue has already been opened, and collect any other information
we need. Choose a 2-4 word topic that describes the issue, explain the issue
and how to reproduce it if known, your browser/OS if relevant, and a
[screenshot or screenGIF](https://zulip.readthedocs.io/en/latest/tutorials/screenshot-and-gif-software.html)
if appropriate.
**Reporting security issues**. Please do not report security issues
publicly, including on public streams on chat.zulip.org. You can
email [security@zulip.com](mailto:security@zulip.com). We create a CVE for every
security issue in our released software.
## User feedback
@@ -343,20 +409,67 @@ by emailing [support@zulip.com](mailto:support@zulip.com).
## Outreach programs
Zulip regularly participates in [Google Summer of Code
(GSoC)](https://developers.google.com/open-source/gsoc/) and
[Outreachy](https://www.outreachy.org/). We have been a GSoC mentoring
organization since 2016, and we accept 15-20 GSoC participants each summer. In
the past, we've also participated in [Google
Code-In](https://developers.google.com/open-source/gci/), and hosted summer
interns from Harvard, MIT, and Stanford.
Zulip participates in [Google Summer of Code
(GSoC)](https://developers.google.com/open-source/gsoc/) every year.
In the past, we've also participated in
[Outreachy](https://www.outreachy.org/), [Google
Code-In](https://developers.google.com/open-source/gci/), and hosted
summer interns from Harvard, MIT, and Stanford.
Check out our [outreach programs
overview](https://zulip.readthedocs.io/en/latest/outreach/overview.html) to learn
more about participating in an outreach program with Zulip. Most of our program
participants end up sticking around the project long-term, and many have become
core team members, maintaining important parts of the project. We hope you
apply!
While each third-party program has its own rules and requirements, the
Zulip community approaches all of these programs with these ideas in
mind:
- We try to make the application process as valuable for the applicant as
possible. Expect high-quality code reviews, a supportive community, and
publicly viewable patches you can link to from your resume, regardless of
whether you are selected.
- To apply, you'll have to submit at least one pull request to a Zulip
repository. Most students accepted to one of our programs have
several merged pull requests (including at least one larger PR) by
the time of the application deadline.
- The main criteria we use are the quality of your best contributions and
the bullets listed at
[What makes a great Zulip contributor](#what-makes-a-great-zulip-contributor).
Because we focus on evaluating your best work, it doesn't hurt your
application to make mistakes in your first few PRs as long as your
work improves.
Most of our outreach program participants end up sticking around the
project long-term, and many have become core team members, maintaining
important parts of the project. We hope you apply!
### Google Summer of Code
The largest outreach program Zulip participates in is GSoC (14
students in 2017; 11 in 2018; 17 in 2019; 18 in 2020; 18 in 2021). While we
don't control how
many slots Google allocates to Zulip, we hope to mentor a similar
number of students in future summers. Check out our [blog
post](https://blog.zulip.com/2021/09/30/google-summer-of-code-2021/) to learn
about the GSoC 2021 experience and our participants' accomplishments.
If you're reading this well before the application deadline and want
to make your application strong, we recommend getting involved in the
community and fixing issues in Zulip now. Having good contributions
and building a reputation for doing good work is the best way to have
a strong application.
Our [GSoC program page][gsoc-guide] has lots more details on how
Zulip does GSoC, as well as project ideas. Note, however, that the project idea
list is maintained only during the GSoC application period, so if
you're looking at some other time of year, the project list is likely
out-of-date.
In some years, we have also run a Zulip Summer of Code (ZSoC)
program for students who we wanted to accept into GSoC but did not have an
official slot for. Student expectations are the
same as with GSoC, and ZSoC has no separate application process; your
GSoC application is your ZSoC application. If we'd like to select you
for ZSoC, we'll contact you when the GSoC results are announced.
[gsoc-guide]: https://zulip.readthedocs.io/en/latest/contributing/gsoc.html
[gsoc-faq]: https://developers.google.com/open-source/gsoc/faq
## Stay connected


@@ -1,25 +1,15 @@
# This is a multiarch Dockerfile. See https://docs.docker.com/desktop/multi-arch/
#
# To set up the first time:
# docker buildx create --name multiarch --use
#
# To build:
# docker buildx build --platform linux/amd64,linux/arm64 \
# -f ./Dockerfile-postgresql -t zulip/zulip-postgresql:14 --push .
# To build run `docker build -f Dockerfile-postgresql .` from the root of the
# zulip repo.
# Currently the PostgreSQL images do not support automatic upgrading of
# the on-disk data in volumes. So the base image can not currently be upgraded
# without users needing a manual pgdump and restore.
# https://hub.docker.com/r/groonga/pgroonga/tags
ARG PGROONGA_VERSION=latest
ARG POSTGRESQL_VERSION=14
FROM groonga/pgroonga:$PGROONGA_VERSION-alpine-$POSTGRESQL_VERSION-slim
# Install hunspell, Zulip stop words, and run Zulip database
# init.
FROM groonga/pgroonga:latest-alpine-10-slim
RUN apk add -U --no-cache hunspell-en
RUN ln -sf /usr/share/hunspell/en_US.dic /usr/local/share/postgresql/tsearch_data/en_us.dict && ln -sf /usr/share/hunspell/en_US.aff /usr/local/share/postgresql/tsearch_data/en_us.affix
RUN ln -sf /usr/share/hunspell/en_US.dic /usr/local/share/postgresql/tsearch_data/en_us.dict && ln -sf /usr/share/hunspell/en_US.aff /usr/local/share/postgresql/tsearch_data/en_us.affix
COPY puppet/zulip/files/postgresql/zulip_english.stop /usr/local/share/postgresql/tsearch_data/zulip_english.stop
COPY scripts/setup/create-db.sql /docker-entrypoint-initdb.d/zulip-create-db.sql
COPY scripts/setup/create-pgroonga.sql /docker-entrypoint-initdb.d/zulip-create-pgroonga.sql


@@ -17,7 +17,6 @@ Come find us on the [development community chat](https://zulip.com/development-c
[![GitHub Actions build status](https://github.com/zulip/zulip/actions/workflows/zulip-ci.yml/badge.svg)](https://github.com/zulip/zulip/actions/workflows/zulip-ci.yml?query=branch%3Amain)
[![coverage status](https://img.shields.io/codecov/c/github/zulip/zulip/main.svg)](https://codecov.io/gh/zulip/zulip)
[![Mypy coverage](https://img.shields.io/badge/mypy-100%25-green.svg)][mypy-coverage]
[![Ruff](https://img.shields.io/endpoint?url=https://raw.githubusercontent.com/charliermarsh/ruff/main/assets/badge/v1.json)](https://github.com/charliermarsh/ruff)
[![code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black)
[![code style: prettier](https://img.shields.io/badge/code_style-prettier-ff69b4.svg)](https://github.com/prettier/prettier)
[![GitHub release](https://img.shields.io/github/release/zulip/zulip.svg)](https://github.com/zulip/zulip/releases/latest)
@@ -34,17 +33,16 @@ Come find us on the [development community chat](https://zulip.com/development-c
## Getting started
- **Contributing code**. Check out our [guide for new
contributors](https://zulip.readthedocs.io/en/latest/contributing/contributing.html)
to get started. We have invested in making Zulip's code highly
readable, thoughtfully tested, and easy to modify. Beyond that, we
have written an extraordinary 150K words of documentation for Zulip
contributors.
contributors](https://zulip.readthedocs.io/en/latest/overview/contributing.html)
to get started. We have invested into making Zulip's code uniquely readable,
well tested, and easy to modify. Beyond that, we have written an extraordinary
150K words of documentation on how to contribute to Zulip.
- **Contributing non-code**. [Report an
issue](https://zulip.readthedocs.io/en/latest/contributing/contributing.html#reporting-issues),
issue](https://zulip.readthedocs.io/en/latest/overview/contributing.html#reporting-issues),
[translate](https://zulip.readthedocs.io/en/latest/translating/translating.html)
Zulip into your language, or [give us
feedback](https://zulip.readthedocs.io/en/latest/contributing/contributing.html#user-feedback).
feedback](https://zulip.readthedocs.io/en/latest/overview/contributing.html#user-feedback).
We'd love to hear from you, whether you've been using Zulip for years, or are just
trying it out for the first time.
@@ -53,7 +51,7 @@ Come find us on the [development community chat](https://zulip.com/development-c
recommend reading about Zulip's [unique
approach](https://zulip.com/why-zulip/) to organizing conversations.
- **Running a Zulip server**. Self-host Zulip directly on Ubuntu or Debian
- **Running a Zulip server**. Self host Zulip directly on Ubuntu or Debian
Linux, in [Docker](https://github.com/zulip/docker-zulip), or with prebuilt
images for [Digital Ocean](https://marketplace.digitalocean.com/apps/zulip) and
[Render](https://render.com/docs/deploy-zulip).
@@ -66,14 +64,14 @@ Come find us on the [development community chat](https://zulip.com/development-c
projects](https://zulip.com/for/open-source/).
- **Participating in [outreach
programs](https://zulip.readthedocs.io/en/latest/contributing/contributing.html#outreach-programs)**
programs](https://zulip.readthedocs.io/en/latest/overview/contributing.html#outreach-programs)**
like [Google Summer of Code](https://developers.google.com/open-source/gsoc/)
and [Outreachy](https://www.outreachy.org/).
- **Supporting Zulip**. Advocate for your organization to use Zulip, become a
[sponsor](https://github.com/sponsors/zulip), write a review in the mobile app
stores, or [help others find
Zulip](https://zulip.readthedocs.io/en/latest/contributing/contributing.html#help-others-find-zulip).
Zulip](https://zulip.readthedocs.io/en/latest/overview/contributing.html#help-others-find-zulip).
You may also be interested in reading our [blog](https://blog.zulip.org/), and
following us on [Twitter](https://twitter.com/zulip) and

Vagrantfile

@@ -12,13 +12,11 @@ Vagrant.configure("2") do |config|
vm_num_cpus = "2"
vm_memory = "2048"
ubuntu_mirror = ""
debian_mirror = ""
vboxadd_version = nil
config.vm.box = "bento/ubuntu-20.04"
config.vm.synced_folder ".", "/vagrant", disabled: true
config.vm.synced_folder ".", "/srv/zulip", docker_consistency: "z"
config.vm.synced_folder ".", "/srv/zulip"
vagrant_config_file = ENV["HOME"] + "/.zulip-vagrant-config"
if File.file?(vagrant_config_file)
@@ -34,7 +32,7 @@ Vagrant.configure("2") do |config|
when "HOST_IP_ADDR"; host_ip_addr = value
when "GUEST_CPUS"; vm_num_cpus = value
when "GUEST_MEMORY_MB"; vm_memory = value
when "UBUNTU_MIRROR"; ubuntu_mirror = value
when "DEBIAN_MIRROR"; debian_mirror = value
when "VBOXADD_VERSION"; vboxadd_version = value
end
end
@@ -63,23 +61,23 @@ Vagrant.configure("2") do |config|
config.vm.network "forwarded_port", guest: 9994, host: host_port + 3, host_ip: host_ip_addr
# Specify Docker provider before VirtualBox provider so it's preferred.
config.vm.provider "docker" do |d, override|
override.vm.box = nil
d.build_dir = File.join(__dir__, "tools", "setup", "dev-vagrant-docker")
d.build_args = ["--build-arg", "VAGRANT_UID=#{Process.uid}"]
if !ubuntu_mirror.empty?
d.build_args += ["--build-arg", "UBUNTU_MIRROR=#{ubuntu_mirror}"]
if !debian_mirror.empty?
d.build_args += ["--build-arg", "DEBIAN_MIRROR=#{debian_mirror}"]
end
d.has_ssh = true
d.create_args = ["--ulimit", "nofile=1024:65536"]
end
config.vm.provider "virtualbox" do |vb, override|
override.vm.box = "bento/debian-10"
# It's possible we can get away with just 1.5GB; more testing needed
vb.memory = vm_memory
vb.cpus = vm_num_cpus
if !vboxadd_version.nil?
override.vbguest.installer = Class.new(VagrantVbguest::Installers::Ubuntu) do
override.vbguest.installer = Class.new(VagrantVbguest::Installers::Debian) do
define_method(:host_version) do |reload = false|
VagrantVbguest::Version(vboxadd_version)
end
@@ -90,12 +88,14 @@ Vagrant.configure("2") do |config|
end
config.vm.provider "hyperv" do |h, override|
override.vm.box = "bento/debian-10"
h.memory = vm_memory
h.maxmemory = vm_memory
h.cpus = vm_num_cpus
end
config.vm.provider "parallels" do |prl, override|
override.vm.box = "bento/debian-10"
prl.memory = vm_memory
prl.cpus = vm_num_cpus
end
@@ -104,5 +104,5 @@ Vagrant.configure("2") do |config|
# We want provision to be run with the permissions of the vagrant user.
privileged: false,
path: "tools/setup/vagrant-provision",
env: { "UBUNTU_MIRROR" => ubuntu_mirror }
env: { "DEBIAN_MIRROR" => debian_mirror }
end


@@ -62,7 +62,7 @@ class CountStat:
else:
self.interval = self.time_increment
def __repr__(self) -> str:
def __str__(self) -> str:
return f"<CountStat: {self.property}>"
def last_successful_fill(self) -> Optional[datetime]:
@@ -206,7 +206,7 @@ def do_aggregate_to_summary_table(
# Aggregate into RealmCount
output_table = stat.data_collector.output_table
if realm is not None:
realm_clause: Composable = SQL("AND zerver_realm.id = {}").format(Literal(realm.id))
realm_clause = SQL("AND zerver_realm.id = {}").format(Literal(realm.id))
else:
realm_clause = SQL("")
@@ -288,7 +288,6 @@ def do_aggregate_to_summary_table(
## Utility functions called from outside counts.py ##
# called from zerver.actions; should not throw any errors
def do_increment_logging_stat(
zerver_object: Union[Realm, UserProfile, Stream],
@@ -358,7 +357,7 @@ def do_pull_by_sql_query(
) -> int:
if group_by is None:
subgroup: Composable = SQL("NULL")
group_by_clause: Composable = SQL("")
group_by_clause = SQL("")
else:
subgroup = Identifier(group_by[0]._meta.db_table, group_by[1])
group_by_clause = SQL(", {}").format(subgroup)
@@ -444,7 +443,7 @@ def do_pull_minutes_active(
def count_message_by_user_query(realm: Optional[Realm]) -> QueryFn:
if realm is None:
realm_clause: Composable = SQL("")
realm_clause = SQL("")
else:
realm_clause = SQL("zerver_userprofile.realm_id = {} AND").format(Literal(realm.id))
return lambda kwargs: SQL(
@@ -471,7 +470,7 @@ def count_message_by_user_query(realm: Optional[Realm]) -> QueryFn:
# Note: ignores the group_by / group_by_clause.
def count_message_type_by_user_query(realm: Optional[Realm]) -> QueryFn:
if realm is None:
realm_clause: Composable = SQL("")
realm_clause = SQL("")
else:
realm_clause = SQL("zerver_userprofile.realm_id = {} AND").format(Literal(realm.id))
return lambda kwargs: SQL(
@@ -520,7 +519,7 @@ def count_message_type_by_user_query(realm: Optional[Realm]) -> QueryFn:
# table, consider writing a new query for efficiency.
def count_message_by_stream_query(realm: Optional[Realm]) -> QueryFn:
if realm is None:
realm_clause: Composable = SQL("")
realm_clause = SQL("")
else:
realm_clause = SQL("zerver_stream.realm_id = {} AND").format(Literal(realm.id))
return lambda kwargs: SQL(
@@ -554,7 +553,7 @@ def count_message_by_stream_query(realm: Optional[Realm]) -> QueryFn:
# currently the only stat that uses this.
def count_user_by_realm_query(realm: Optional[Realm]) -> QueryFn:
if realm is None:
realm_clause: Composable = SQL("")
realm_clause = SQL("")
else:
realm_clause = SQL("zerver_userprofile.realm_id = {} AND").format(Literal(realm.id))
return lambda kwargs: SQL(
@@ -584,7 +583,7 @@ def count_user_by_realm_query(realm: Optional[Realm]) -> QueryFn:
# In particular, it's important to ensure that migrations don't cause that to happen.
def check_realmauditlog_by_user_query(realm: Optional[Realm]) -> QueryFn:
if realm is None:
realm_clause: Composable = SQL("")
realm_clause = SQL("")
else:
realm_clause = SQL("realm_id = {} AND").format(Literal(realm.id))
return lambda kwargs: SQL(
@@ -624,7 +623,7 @@ def check_realmauditlog_by_user_query(realm: Optional[Realm]) -> QueryFn:
def check_useractivityinterval_by_user_query(realm: Optional[Realm]) -> QueryFn:
if realm is None:
realm_clause: Composable = SQL("")
realm_clause = SQL("")
else:
realm_clause = SQL("zerver_userprofile.realm_id = {} AND").format(Literal(realm.id))
return lambda kwargs: SQL(
@@ -648,7 +647,7 @@ def check_useractivityinterval_by_user_query(realm: Optional[Realm]) -> QueryFn:
def count_realm_active_humans_query(realm: Optional[Realm]) -> QueryFn:
if realm is None:
realm_clause: Composable = SQL("")
realm_clause = SQL("")
else:
realm_clause = SQL("realm_id = {} AND").format(Literal(realm.id))
return lambda kwargs: SQL(
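Throughout these `count_*_query` helpers, the optional realm filter is composed with psycopg2's SQL composition API rather than string interpolation; the 5.2 branch simply drops the explicit `Composable` annotations. A minimal standalone sketch of the pattern (the query template and the `realm_filter` helper are illustrative, not the exact Zulip query):

```python
from typing import Optional

from psycopg2.sql import SQL, Composable, Literal


def realm_filter(realm_id: Optional[int]) -> Composable:
    # Empty clause when aggregating across the whole installation;
    # otherwise a literal-quoted realm id (Literal handles escaping).
    if realm_id is None:
        return SQL("")
    return SQL("zerver_userprofile.realm_id = {} AND").format(Literal(realm_id))


# The helpers then splice the clause into a larger query template:
query = SQL(
    """
    SELECT COUNT(*) FROM zerver_userprofile
    WHERE {realm_clause} is_active
    """
).format(realm_clause=realm_filter(42))
```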


@@ -8,7 +8,7 @@ from django.utils.timezone import now as timezone_now
from analytics.lib.counts import COUNT_STATS, CountStat
from analytics.models import installation_epoch
from zerver.lib.timestamp import TimeZoneNotUTCError, floor_to_day, floor_to_hour, verify_UTC
from zerver.lib.timestamp import TimeZoneNotUTCException, floor_to_day, floor_to_hour, verify_UTC
from zerver.models import Realm
states = {
@@ -48,7 +48,7 @@ class Command(BaseCommand):
last_fill = installation_epoch()
try:
verify_UTC(last_fill)
except TimeZoneNotUTCError:
except TimeZoneNotUTCException:
return {"status": 2, "message": f"FillState not in UTC for {property}"}
if stat.frequency == CountStat.DAY:


@@ -1,8 +1,7 @@
import os
from datetime import timedelta
from typing import Any, Dict, List, Mapping, Type, Union
from unittest import mock
from django.core.files.uploadedfile import UploadedFile
from django.core.management.base import BaseCommand
from django.utils.timezone import now as timezone_now
@@ -20,20 +19,9 @@ from analytics.models import (
from zerver.actions.create_realm import do_create_realm
from zerver.actions.users import do_change_user_role
from zerver.lib.create_user import create_user
from zerver.lib.storage import static_path
from zerver.lib.stream_color import STREAM_ASSIGNMENT_COLORS
from zerver.lib.timestamp import floor_to_day
from zerver.lib.upload import upload_message_attachment_from_request
from zerver.models import (
Client,
Realm,
RealmAuditLog,
Recipient,
Stream,
Subscription,
UserGroup,
UserProfile,
)
from zerver.models import Client, Realm, Recipient, Stream, Subscription, UserProfile
class Command(BaseCommand):
@@ -91,61 +79,31 @@ class Command(BaseCommand):
string_id="analytics", name="Analytics", date_created=installation_time
)
shylock = create_user(
"shylock@analytics.ds",
"Shylock",
realm,
full_name="Shylock",
role=UserProfile.ROLE_REALM_OWNER,
force_date_joined=installation_time,
)
with mock.patch("zerver.lib.create_user.timezone_now", return_value=installation_time):
shylock = create_user(
"shylock@analytics.ds",
"Shylock",
realm,
full_name="Shylock",
role=UserProfile.ROLE_REALM_OWNER,
)
do_change_user_role(shylock, UserProfile.ROLE_REALM_OWNER, acting_user=None)
# Create guest user for set_guest_users_statistic.
create_user(
"bassanio@analytics.ds",
"Bassanio",
realm,
full_name="Bassanio",
role=UserProfile.ROLE_GUEST,
force_date_joined=installation_time,
)
administrators_user_group = UserGroup.objects.get(
name=UserGroup.ADMINISTRATORS_GROUP_NAME, realm=realm, is_system_group=True
)
stream = Stream.objects.create(
name="all",
realm=realm,
date_created=installation_time,
can_remove_subscribers_group=administrators_user_group,
)
stream = Stream.objects.create(name="all", realm=realm, date_created=installation_time)
recipient = Recipient.objects.create(type_id=stream.id, type=Recipient.STREAM)
stream.recipient = recipient
stream.save(update_fields=["recipient"])
# Subscribe shylock to the stream to avoid invariant failures.
Subscription.objects.create(
recipient=recipient,
user_profile=shylock,
is_user_active=shylock.is_active,
color=STREAM_ASSIGNMENT_COLORS[0],
)
RealmAuditLog.objects.create(
realm=realm,
modified_user=shylock,
modified_stream=stream,
event_last_message_id=0,
event_type=RealmAuditLog.SUBSCRIPTION_CREATED,
event_time=installation_time,
)
# Create an attachment in the database for set_storage_space_used_statistic.
IMAGE_FILE_PATH = static_path("images/test-images/checkbox.png")
file_info = os.stat(IMAGE_FILE_PATH)
file_size = file_info.st_size
with open(IMAGE_FILE_PATH, "rb") as fp:
upload_message_attachment_from_request(UploadedFile(fp), shylock, file_size)
# TODO: This should use subscribe_users_to_streams from populate_db.
subs = [
Subscription(
recipient=recipient,
user_profile=shylock,
is_user_active=shylock.is_active,
color=STREAM_ASSIGNMENT_COLORS[0],
),
]
Subscription.objects.bulk_create(subs)
FixtureData = Mapping[Union[str, int, None], List[int]]


@@ -4,6 +4,7 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("zerver", "0030_realm_org_type"),
migrations.swappable_dependency(settings.AUTH_USER_MODEL),


@@ -2,6 +2,7 @@ from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
("analytics", "0001_initial"),
]


@@ -2,6 +2,7 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("analytics", "0002_remove_huddlecount"),
]


@@ -2,6 +2,7 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("analytics", "0003_fillstate"),
]


@@ -2,6 +2,7 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("analytics", "0004_add_subgroup"),
]


@@ -2,6 +2,7 @@ from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
("analytics", "0005_alter_field_size"),
]


@@ -3,6 +3,7 @@ from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
("analytics", "0006_add_subgroup_to_unique_constraints"),
]


@@ -3,6 +3,7 @@ from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
("zerver", "0050_userprofile_avatar_version"),
("analytics", "0007_remove_interval"),


@@ -1,10 +1,10 @@
from django.db import migrations
from django.db.backends.base.schema import BaseDatabaseSchemaEditor
from django.db.backends.postgresql.schema import DatabaseSchemaEditor
from django.db.migrations.state import StateApps
def delete_messages_sent_to_stream_stat(
apps: StateApps, schema_editor: BaseDatabaseSchemaEditor
apps: StateApps, schema_editor: DatabaseSchemaEditor
) -> None:
UserCount = apps.get_model("analytics", "UserCount")
StreamCount = apps.get_model("analytics", "StreamCount")
@@ -21,6 +21,7 @@ def delete_messages_sent_to_stream_stat(
class Migration(migrations.Migration):
dependencies = [
("analytics", "0008_add_count_indexes"),
]


@@ -1,10 +1,10 @@
from django.db import migrations
from django.db.backends.base.schema import BaseDatabaseSchemaEditor
from django.db.backends.postgresql.schema import DatabaseSchemaEditor
from django.db.migrations.state import StateApps
def clear_message_sent_by_message_type_values(
apps: StateApps, schema_editor: BaseDatabaseSchemaEditor
apps: StateApps, schema_editor: DatabaseSchemaEditor
) -> None:
UserCount = apps.get_model("analytics", "UserCount")
StreamCount = apps.get_model("analytics", "StreamCount")
@@ -21,6 +21,7 @@ def clear_message_sent_by_message_type_values(
class Migration(migrations.Migration):
dependencies = [("analytics", "0009_remove_messages_to_stream_stat")]
operations = [


@@ -1,9 +1,9 @@
from django.db import migrations
from django.db.backends.base.schema import BaseDatabaseSchemaEditor
from django.db.backends.postgresql.schema import DatabaseSchemaEditor
from django.db.migrations.state import StateApps
def clear_analytics_tables(apps: StateApps, schema_editor: BaseDatabaseSchemaEditor) -> None:
def clear_analytics_tables(apps: StateApps, schema_editor: DatabaseSchemaEditor) -> None:
UserCount = apps.get_model("analytics", "UserCount")
StreamCount = apps.get_model("analytics", "StreamCount")
RealmCount = apps.get_model("analytics", "RealmCount")
@@ -18,6 +18,7 @@ def clear_analytics_tables(apps: StateApps, schema_editor: BaseDatabaseSchemaEdi
class Migration(migrations.Migration):
dependencies = [
("analytics", "0010_clear_messages_sent_values"),
]


@@ -5,6 +5,7 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("analytics", "0011_clear_analytics_tables"),
]


@@ -4,6 +4,7 @@ from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
("analytics", "0012_add_on_delete"),
]


@@ -4,6 +4,7 @@ from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
("analytics", "0013_remove_anomaly"),
]


@@ -1,10 +1,10 @@
from django.db import migrations
from django.db.backends.base.schema import BaseDatabaseSchemaEditor
from django.db.backends.postgresql.schema import DatabaseSchemaEditor
from django.db.migrations.state import StateApps
from django.db.models import Count, Sum
def clear_duplicate_counts(apps: StateApps, schema_editor: BaseDatabaseSchemaEditor) -> None:
def clear_duplicate_counts(apps: StateApps, schema_editor: DatabaseSchemaEditor) -> None:
"""This is a preparatory migration for our Analytics tables.
The backstory is that Django's unique_together indexes do not properly
@@ -55,6 +55,7 @@ def clear_duplicate_counts(apps: StateApps, schema_editor: BaseDatabaseSchemaEdi
class Migration(migrations.Migration):
dependencies = [
("analytics", "0014_remove_fillstate_last_modified"),
]


@@ -4,6 +4,7 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("analytics", "0015_clear_duplicate_counts"),
]


@@ -1,4 +1,5 @@
import datetime
from typing import Optional
from django.db import models
from django.db.models import Q, UniqueConstraint
@@ -8,16 +9,16 @@ from zerver.models import Realm, Stream, UserProfile
class FillState(models.Model):
property = models.CharField(max_length=40, unique=True)
end_time = models.DateTimeField()
property: str = models.CharField(max_length=40, unique=True)
end_time: datetime.datetime = models.DateTimeField()
# Valid states are {DONE, STARTED}
DONE = 1
STARTED = 2
state = models.PositiveSmallIntegerField()
state: int = models.PositiveSmallIntegerField()
def __str__(self) -> str:
return f"{self.property} {self.end_time} {self.state}"
return f"<FillState: {self.property} {self.end_time} {self.state}>"
# The earliest/starting end_time in FillState
@@ -33,10 +34,10 @@ class BaseCount(models.Model):
# Note: When inheriting from BaseCount, you may want to rearrange
# the order of the columns in the migration to make sure they
# match how you'd like the table to be arranged.
property = models.CharField(max_length=32)
subgroup = models.CharField(max_length=16, null=True)
end_time = models.DateTimeField()
value = models.BigIntegerField()
property: str = models.CharField(max_length=32)
subgroup: Optional[str] = models.CharField(max_length=16, null=True)
end_time: datetime.datetime = models.DateTimeField()
value: int = models.BigIntegerField()
class Meta:
abstract = True
@@ -59,7 +60,7 @@ class InstallationCount(BaseCount):
]
def __str__(self) -> str:
return f"{self.property} {self.subgroup} {self.value}"
return f"<InstallationCount: {self.property} {self.subgroup} {self.value}>"
class RealmCount(BaseCount):
@@ -82,7 +83,7 @@ class RealmCount(BaseCount):
index_together = ["property", "end_time"]
def __str__(self) -> str:
return f"{self.realm!r} {self.property} {self.subgroup} {self.value}"
return f"<RealmCount: {self.realm} {self.property} {self.subgroup} {self.value}>"
class UserCount(BaseCount):
@@ -108,7 +109,7 @@ class UserCount(BaseCount):
index_together = ["property", "realm", "end_time"]
def __str__(self) -> str:
return f"{self.user!r} {self.property} {self.subgroup} {self.value}"
return f"<UserCount: {self.user} {self.property} {self.subgroup} {self.value}>"
class StreamCount(BaseCount):
@@ -134,4 +135,6 @@ class StreamCount(BaseCount):
index_together = ["property", "realm", "end_time"]
def __str__(self) -> str:
return f"{self.stream!r} {self.property} {self.subgroup} {self.value} {self.id}"
return (
f"<StreamCount: {self.stream} {self.property} {self.subgroup} {self.value} {self.id}>"
)
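
The models.py hunks above toggle between two styles of declaring the same fields: bare field assignments with plain f-string __str__ output on one side, and explicit class-level type annotations with angle-bracketed "<ClassName: ...>" reprs on the other. A minimal sketch of both styles on a hypothetical WidgetCount model (marked abstract so the sketch does not need a real Django app):

# Sketch only -- WidgetCount is hypothetical and not part of the diff.
import datetime

from django.db import models


class WidgetCount(models.Model):
    # Style without annotations: field types are inferred (e.g. by django-stubs).
    property = models.CharField(max_length=32)
    end_time = models.DateTimeField()
    value = models.BigIntegerField()

    class Meta:
        abstract = True

    def __str__(self) -> str:
        return f"{self.property} {self.end_time} {self.value}"


class AnnotatedWidgetCount(models.Model):
    # Style with explicit annotations and an angle-bracketed repr.
    property: str = models.CharField(max_length=32)
    end_time: datetime.datetime = models.DateTimeField()
    value: int = models.BigIntegerField()

    class Meta:
        abstract = True

    def __str__(self) -> str:
        return f"<AnnotatedWidgetCount: {self.property} {self.end_time} {self.value}>"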


@@ -3,6 +3,7 @@ from unittest import mock
from django.utils.timezone import now as timezone_now
from zerver.lib.test_classes import ZulipTestCase
from zerver.lib.test_helpers import queries_captured
from zerver.models import Client, UserActivity, UserProfile, flush_per_request_caches
@@ -32,17 +33,23 @@ class ActivityTest(ZulipTestCase):
user_profile.save(update_fields=["is_staff"])
flush_per_request_caches()
with self.assert_database_query_count(18):
with queries_captured() as queries:
result = self.client_get("/activity")
self.assertEqual(result.status_code, 200)
self.assert_length(queries, 19)
flush_per_request_caches()
with self.assert_database_query_count(8):
with queries_captured() as queries:
result = self.client_get("/realm_activity/zulip/")
self.assertEqual(result.status_code, 200)
self.assert_length(queries, 8)
iago = self.example_user("iago")
flush_per_request_caches()
with self.assert_database_query_count(5):
with queries_captured() as queries:
result = self.client_get(f"/user_activity/{iago.id}/")
self.assertEqual(result.status_code, 200)
self.assert_length(queries, 5)
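
The test hunks above switch between explicit query capturing and the assert_database_query_count context manager (which side is the addition depends on which end of the comparison you read, since the +/- markers are not shown). A minimal sketch of both idioms, assuming a ZulipTestCase subclass and reusing /activity and the counts from the hunks as examples:

# Sketch only -- the endpoint and expected counts come from the hunks above
# and serve purely as examples.
from zerver.lib.test_classes import ZulipTestCase
from zerver.lib.test_helpers import queries_captured


class QueryCountSketch(ZulipTestCase):
    def test_declared_count(self) -> None:
        # Context-manager helper: the expected number of queries is stated up
        # front and verified when the block exits.
        with self.assert_database_query_count(18):
            result = self.client_get("/activity")
        self.assertEqual(result.status_code, 200)

    def test_captured_queries(self) -> None:
        # Capture idiom: record every query, then assert on how many ran.
        with queries_captured() as queries:
            result = self.client_get("/activity")
        self.assertEqual(result.status_code, 200)
        self.assert_length(queries, 19)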


@@ -53,7 +53,7 @@ from zerver.actions.users import do_deactivate_user
from zerver.lib.create_user import create_user
from zerver.lib.exceptions import InvitationError
from zerver.lib.test_classes import ZulipTestCase
from zerver.lib.timestamp import TimeZoneNotUTCError, floor_to_day
from zerver.lib.timestamp import TimeZoneNotUTCException, floor_to_day
from zerver.lib.topic import DB_TOPIC_NAME
from zerver.lib.utils import assert_is_not_none
from zerver.models import (
@@ -66,11 +66,9 @@ from zerver.models import (
Recipient,
Stream,
UserActivityInterval,
UserGroup,
UserProfile,
get_client,
get_user,
is_cross_realm_bot_email,
)
@@ -86,13 +84,10 @@ class AnalyticsTestCase(ZulipTestCase):
self.default_realm = do_create_realm(
string_id="realmtest", name="Realm Test", date_created=self.TIME_ZERO - 2 * self.DAY
)
self.administrators_user_group = UserGroup.objects.get(
name=UserGroup.ADMINISTRATORS_GROUP_NAME, realm=self.default_realm, is_system_group=True
)
# used to generate unique names in self.create_*
self.name_counter = 100
# used as defaults in self.assert_table_count
# used as defaults in self.assertCountEquals
self.current_property: Optional[str] = None
# Lightweight creation of users, streams, and messages
@@ -130,7 +125,6 @@ class AnalyticsTestCase(ZulipTestCase):
"name": f"stream name {self.name_counter}",
"realm": self.default_realm,
"date_created": self.TIME_LAST_HOUR,
"can_remove_subscribers_group": self.administrators_user_group,
}
for key, value in defaults.items():
kwargs[key] = kwargs.get(key, value)
@@ -159,18 +153,13 @@ class AnalyticsTestCase(ZulipTestCase):
"content": "hi",
"date_sent": self.TIME_LAST_HOUR,
"sending_client": get_client("website"),
"realm_id": sender.realm_id,
}
# For simplicity, this helper doesn't support creating cross-realm messages
# since it'd require adding an additional realm argument.
assert not is_cross_realm_bot_email(sender.delivery_email)
for key, value in defaults.items():
kwargs[key] = kwargs.get(key, value)
return Message.objects.create(**kwargs)
# kwargs should only ever be a UserProfile or Stream.
def assert_table_count(
def assertCountEquals(
self,
table: Type[BaseCount],
value: int,
@@ -227,13 +216,14 @@ class AnalyticsTestCase(ZulipTestCase):
kwargs[arg_keys[i]] = values[i]
for key, value in defaults.items():
kwargs[key] = kwargs.get(key, value)
if table is not InstallationCount and "realm" not in kwargs:
if "user" in kwargs:
kwargs["realm"] = kwargs["user"].realm
elif "stream" in kwargs:
kwargs["realm"] = kwargs["stream"].realm
else:
kwargs["realm"] = self.default_realm
if table is not InstallationCount:
if "realm" not in kwargs:
if "user" in kwargs:
kwargs["realm"] = kwargs["user"].realm
elif "stream" in kwargs:
kwargs["realm"] = kwargs["stream"].realm
else:
kwargs["realm"] = self.default_realm
self.assertEqual(table.objects.filter(**kwargs).count(), 1)
self.assert_length(arg_values, table.objects.count())
@@ -289,7 +279,7 @@ class TestProcessCountStat(AnalyticsTestCase):
stat = self.make_dummy_count_stat("test stat")
with self.assertRaises(ValueError):
process_count_stat(stat, installation_epoch() + 65 * self.MINUTE)
with self.assertRaises(TimeZoneNotUTCError):
with self.assertRaises(TimeZoneNotUTCException):
process_count_stat(stat, installation_epoch().replace(tzinfo=None))
# This tests the LoggingCountStat branch of the code in do_delete_counts_at_hour.
@@ -773,9 +763,9 @@ class TestCountStats(AnalyticsTestCase):
do_fill_count_stat_at_hour(stat, self.TIME_ZERO)
self.assert_table_count(UserCount, 1, subgroup="private_message")
self.assert_table_count(UserCount, 1, subgroup="huddle_message")
self.assert_table_count(UserCount, 1, subgroup="public_stream")
self.assertCountEquals(UserCount, 1, subgroup="private_message")
self.assertCountEquals(UserCount, 1, subgroup="huddle_message")
self.assertCountEquals(UserCount, 1, subgroup="public_stream")
def test_messages_sent_by_client(self) -> None:
stat = COUNT_STATS["messages_sent:client:day"]
@@ -1381,49 +1371,47 @@ class TestLoggingCountStats(AnalyticsTestCase):
user = self.create_user(email="first@domain.tld")
stream, _ = self.create_stream_with_recipient()
invite_expires_in_minutes = 2 * 24 * 60
with mock.patch("zerver.actions.invites.too_many_recent_realm_invites", return_value=False):
do_invite_users(
user,
["user1@domain.tld", "user2@domain.tld"],
[stream],
invite_expires_in_minutes=invite_expires_in_minutes,
)
invite_expires_in_days = 2
do_invite_users(
user,
["user1@domain.tld", "user2@domain.tld"],
[stream],
invite_expires_in_days=invite_expires_in_days,
)
assertInviteCountEquals(2)
# We currently send emails when re-inviting users that haven't
# turned into accounts, so count them towards the total
with mock.patch("zerver.actions.invites.too_many_recent_realm_invites", return_value=False):
do_invite_users(
user,
["user1@domain.tld", "user2@domain.tld"],
[stream],
invite_expires_in_minutes=invite_expires_in_minutes,
)
do_invite_users(
user,
["user1@domain.tld", "user2@domain.tld"],
[stream],
invite_expires_in_days=invite_expires_in_days,
)
assertInviteCountEquals(4)
# Test mix of good and malformed invite emails
with self.assertRaises(InvitationError), mock.patch(
"zerver.actions.invites.too_many_recent_realm_invites", return_value=False
):
try:
do_invite_users(
user,
["user3@domain.tld", "malformed"],
[stream],
invite_expires_in_minutes=invite_expires_in_minutes,
invite_expires_in_days=invite_expires_in_days,
)
except InvitationError:
pass
assertInviteCountEquals(4)
# Test inviting existing users
with self.assertRaises(InvitationError), mock.patch(
"zerver.actions.invites.too_many_recent_realm_invites", return_value=False
):
try:
do_invite_users(
user,
["first@domain.tld", "user4@domain.tld"],
[stream],
invite_expires_in_minutes=invite_expires_in_minutes,
invite_expires_in_days=invite_expires_in_days,
)
except InvitationError:
pass
assertInviteCountEquals(5)
# Revoking invite should not give you credit
@@ -1433,8 +1421,7 @@ class TestLoggingCountStats(AnalyticsTestCase):
assertInviteCountEquals(5)
# Resending invite should cost you
with mock.patch("zerver.actions.invites.too_many_recent_realm_invites", return_value=False):
do_resend_user_invite_email(assert_is_not_none(PreregistrationUser.objects.first()))
do_resend_user_invite_email(assert_is_not_none(PreregistrationUser.objects.first()))
assertInviteCountEquals(6)
def test_messages_read_hour(self) -> None:


@@ -124,7 +124,8 @@ class TestGetChartData(ZulipTestCase):
stat = COUNT_STATS["active_users_audit:is_bot:day"]
self.insert_data(stat, ["false"], [])
result = self.client_get("/json/analytics/chart_data", {"chart_name": "number_of_humans"})
data = self.assert_json_success(result)
self.assert_json_success(result)
data = result.json()
self.assertEqual(
data,
{
@@ -147,7 +148,8 @@ class TestGetChartData(ZulipTestCase):
result = self.client_get(
"/json/analytics/chart_data", {"chart_name": "messages_sent_over_time"}
)
data = self.assert_json_success(result)
self.assert_json_success(result)
data = result.json()
self.assertEqual(
data,
{
@@ -169,7 +171,8 @@ class TestGetChartData(ZulipTestCase):
result = self.client_get(
"/json/analytics/chart_data", {"chart_name": "messages_sent_by_message_type"}
)
data = self.assert_json_success(result)
self.assert_json_success(result)
data = result.json()
self.assertEqual(
data,
{
@@ -179,20 +182,20 @@ class TestGetChartData(ZulipTestCase):
"everyone": {
"Public streams": self.data(100),
"Private streams": self.data(0),
"Direct messages": self.data(101),
"Group direct messages": self.data(0),
"Private messages": self.data(101),
"Group private messages": self.data(0),
},
"user": {
"Public streams": self.data(200),
"Private streams": self.data(201),
"Direct messages": self.data(0),
"Group direct messages": self.data(0),
"Private messages": self.data(0),
"Group private messages": self.data(0),
},
"display_order": [
"Direct messages",
"Private messages",
"Public streams",
"Private streams",
"Group direct messages",
"Group private messages",
],
"result": "success",
},
@@ -212,7 +215,8 @@ class TestGetChartData(ZulipTestCase):
result = self.client_get(
"/json/analytics/chart_data", {"chart_name": "messages_sent_by_client"}
)
data = self.assert_json_success(result)
self.assert_json_success(result)
data = result.json()
self.assertEqual(
data,
{
@@ -236,7 +240,8 @@ class TestGetChartData(ZulipTestCase):
result = self.client_get(
"/json/analytics/chart_data", {"chart_name": "messages_read_over_time"}
)
data = self.assert_json_success(result)
self.assert_json_success(result)
data = result.json()
self.assertEqual(
data,
{
@@ -257,7 +262,8 @@ class TestGetChartData(ZulipTestCase):
state=FillState.DONE,
)
result = self.client_get("/json/analytics/chart_data", {"chart_name": "number_of_humans"})
data = self.assert_json_success(result)
self.assert_json_success(result)
data = result.json()
self.assertEqual(data["everyone"], {"_1day": [0], "_15day": [0], "all_time": [0]})
self.assertFalse("user" in data)
@@ -269,7 +275,8 @@ class TestGetChartData(ZulipTestCase):
result = self.client_get(
"/json/analytics/chart_data", {"chart_name": "messages_sent_over_time"}
)
data = self.assert_json_success(result)
self.assert_json_success(result)
data = result.json()
self.assertEqual(data["everyone"], {"human": [0], "bot": [0]})
self.assertEqual(data["user"], {"human": [0], "bot": [0]})
@@ -281,14 +288,15 @@ class TestGetChartData(ZulipTestCase):
result = self.client_get(
"/json/analytics/chart_data", {"chart_name": "messages_sent_by_message_type"}
)
data = self.assert_json_success(result)
self.assert_json_success(result)
data = result.json()
self.assertEqual(
data["everyone"],
{
"Public streams": [0],
"Private streams": [0],
"Direct messages": [0],
"Group direct messages": [0],
"Private messages": [0],
"Group private messages": [0],
},
)
self.assertEqual(
@@ -296,8 +304,8 @@ class TestGetChartData(ZulipTestCase):
{
"Public streams": [0],
"Private streams": [0],
"Direct messages": [0],
"Group direct messages": [0],
"Private messages": [0],
"Group private messages": [0],
},
)
@@ -309,7 +317,8 @@ class TestGetChartData(ZulipTestCase):
result = self.client_get(
"/json/analytics/chart_data", {"chart_name": "messages_sent_by_client"}
)
data = self.assert_json_success(result)
self.assert_json_success(result)
data = result.json()
self.assertEqual(data["everyone"], {})
self.assertEqual(data["user"], {})
@@ -331,7 +340,8 @@ class TestGetChartData(ZulipTestCase):
"end": end_time_timestamps[2],
},
)
data = self.assert_json_success(result)
self.assert_json_success(result)
data = result.json()
self.assertEqual(data["end_times"], end_time_timestamps[1:3])
self.assertEqual(
data["everyone"], {"_1day": [0, 100], "_15day": [0, 100], "all_time": [0, 100]}
@@ -359,7 +369,8 @@ class TestGetChartData(ZulipTestCase):
result = self.client_get(
"/json/analytics/chart_data", {"chart_name": "number_of_humans", "min_length": 2}
)
data = self.assert_json_success(result)
self.assert_json_success(result)
data = result.json()
self.assertEqual(
data["end_times"], [datetime_to_timestamp(dt) for dt in self.end_times_day]
)
@@ -371,7 +382,8 @@ class TestGetChartData(ZulipTestCase):
result = self.client_get(
"/json/analytics/chart_data", {"chart_name": "number_of_humans", "min_length": 5}
)
data = self.assert_json_success(result)
self.assert_json_success(result)
data = result.json()
end_times = [
ceiling_to_day(self.realm.date_created) + timedelta(days=i) for i in range(-1, 4)
]
@@ -609,7 +621,6 @@ class TestMapArrays(ZulipTestCase):
"SomethingRandom": [4, 5, 6],
"ZulipGitHubWebhook": [7, 7, 9],
"ZulipAndroid": [64, 63, 65],
"ZulipTerminal": [9, 10, 11],
}
result = rewrite_client_arrays(a)
self.assertEqual(
@@ -619,11 +630,10 @@ class TestMapArrays(ZulipTestCase):
"Old iOS app": [1, 2, 3],
"Desktop app": [2, 5, 7],
"Mobile app": [1, 5, 7],
"Web app": [1, 2, 3],
"Website": [1, 2, 3],
"Python API": [2, 4, 6],
"SomethingRandom": [4, 5, 6],
"GitHub webhook": [7, 7, 9],
"Old Android app": [64, 63, 65],
"Terminal app": [9, 10, 11],
},
)


@@ -1,17 +1,16 @@
from datetime import datetime, timedelta, timezone
from typing import TYPE_CHECKING, Optional
from unittest import mock
import orjson
from django.http import HttpResponse
from django.utils.timezone import now as timezone_now
from corporate.lib.stripe import add_months, update_sponsorship_status
from corporate.models import Customer, CustomerPlan, LicenseLedger, get_customer_by_realm
from zerver.actions.invites import do_create_multiuse_invite_link
from zerver.actions.realm_settings import do_change_realm_org_type, do_send_realm_reactivation_email
from zerver.actions.user_settings import do_change_user_setting
from zerver.actions.realm_settings import do_send_realm_reactivation_email, do_set_realm_property
from zerver.lib.test_classes import ZulipTestCase
from zerver.lib.test_helpers import reset_email_visibility_to_everyone_in_zulip_realm
from zerver.lib.test_helpers import reset_emails_in_zulip_realm
from zerver.models import (
MultiuseInvite,
PreregistrationUser,
@@ -22,20 +21,13 @@ from zerver.models import (
get_realm,
)
if TYPE_CHECKING:
from django.test.client import _MonkeyPatchedWSGIResponse as TestHttpResponse
class TestSupportEndpoint(ZulipTestCase):
def test_search(self) -> None:
reset_email_visibility_to_everyone_in_zulip_realm()
lear_user = self.lear_user("king")
lear_user.is_staff = True
lear_user.save(update_fields=["is_staff"])
lear_realm = get_realm("lear")
reset_emails_in_zulip_realm()
def assert_user_details_in_html_response(
html_response: "TestHttpResponse", full_name: str, email: str, role: str
html_response: HttpResponse, full_name: str, email: str, role: str
) -> None:
self.assert_in_success_response(
[
@@ -48,22 +40,7 @@ class TestSupportEndpoint(ZulipTestCase):
html_response,
)
def create_invitation(
stream: str, invitee_email: str, realm: Optional[Realm] = None
) -> None:
invite_expires_in_minutes = 10 * 24 * 60
self.client_post(
"/json/invites",
{
"invitee_emails": [invitee_email],
"stream_ids": orjson.dumps([self.get_stream_id(stream, realm)]).decode(),
"invite_expires_in_minutes": invite_expires_in_minutes,
"invite_as": PreregistrationUser.INVITE_AS["MEMBER"],
},
subdomain=realm.string_id if realm is not None else "zulip",
)
def check_hamlet_user_query_result(result: "TestHttpResponse") -> None:
def check_hamlet_user_query_result(result: HttpResponse) -> None:
assert_user_details_in_html_response(
result, "King Hamlet", self.example_email("hamlet"), "Member"
)
@@ -79,22 +56,17 @@ class TestSupportEndpoint(ZulipTestCase):
result,
)
def check_lear_user_query_result(result: "TestHttpResponse") -> None:
assert_user_details_in_html_response(
result, lear_user.full_name, lear_user.email, "Member"
)
def check_othello_user_query_result(result: "TestHttpResponse") -> None:
def check_othello_user_query_result(result: HttpResponse) -> None:
assert_user_details_in_html_response(
result, "Othello, the Moor of Venice", self.example_email("othello"), "Member"
)
def check_polonius_user_query_result(result: "TestHttpResponse") -> None:
def check_polonius_user_query_result(result: HttpResponse) -> None:
assert_user_details_in_html_response(
result, "Polonius", self.example_email("polonius"), "Guest"
)
def check_zulip_realm_query_result(result: "TestHttpResponse") -> None:
def check_zulip_realm_query_result(result: HttpResponse) -> None:
zulip_realm = get_realm("zulip")
first_human_user = zulip_realm.get_first_human_user()
assert first_human_user is not None
@@ -115,7 +87,8 @@ class TestSupportEndpoint(ZulipTestCase):
result,
)
def check_lear_realm_query_result(result: "TestHttpResponse") -> None:
def check_lear_realm_query_result(result: HttpResponse) -> None:
lear_realm = get_realm("lear")
self.assert_in_success_response(
[
f'<input type="hidden" name="realm_id" value="{lear_realm.id}"',
@@ -140,7 +113,7 @@ class TestSupportEndpoint(ZulipTestCase):
)
def check_preregistration_user_query_result(
result: "TestHttpResponse", email: str, invite: bool = False
result: HttpResponse, email: str, invite: bool = False
) -> None:
self.assert_in_success_response(
[
@@ -154,7 +127,7 @@ class TestSupportEndpoint(ZulipTestCase):
self.assert_in_success_response(
[
"<b>Expires in</b>: 1\xa0week, 3\xa0days",
"<b>Status</b>: Link has not been used",
"<b>Status</b>: Link has never been clicked",
],
result,
)
@@ -164,12 +137,12 @@ class TestSupportEndpoint(ZulipTestCase):
self.assert_in_success_response(
[
"<b>Expires in</b>: 1\xa0day",
"<b>Status</b>: Link has not been used",
"<b>Status</b>: Link has never been clicked",
],
result,
)
def check_realm_creation_query_result(result: "TestHttpResponse", email: str) -> None:
def check_realm_creation_query_result(result: HttpResponse, email: str) -> None:
self.assert_in_success_response(
[
'<span class="label">preregistration user</span>\n',
@@ -180,7 +153,7 @@ class TestSupportEndpoint(ZulipTestCase):
result,
)
def check_multiuse_invite_link_query_result(result: "TestHttpResponse") -> None:
def check_multiuse_invite_link_query_result(result: HttpResponse) -> None:
self.assert_in_success_response(
[
'<span class="label">multiuse invite</span>\n',
@@ -190,7 +163,7 @@ class TestSupportEndpoint(ZulipTestCase):
result,
)
def check_realm_reactivation_link_query_result(result: "TestHttpResponse") -> None:
def check_realm_reactivation_link_query_result(result: HttpResponse) -> None:
self.assert_in_success_response(
[
'<span class="label">realm reactivation</span>\n',
@@ -200,13 +173,6 @@ class TestSupportEndpoint(ZulipTestCase):
result,
)
def get_check_query_result(
query: str, count: int, subdomain: str = "zulip"
) -> "TestHttpResponse":
result = self.client_get("/activity/support", {"q": query}, subdomain=subdomain)
self.assertEqual(result.content.decode().count("support-query-result"), count)
return result
self.login("cordelia")
result = self.client_get("/activity/support")
@@ -215,14 +181,14 @@ class TestSupportEndpoint(ZulipTestCase):
self.login("iago")
do_change_user_setting(
self.example_user("hamlet"),
do_set_realm_property(
get_realm("zulip"),
"email_address_visibility",
UserProfile.EMAIL_ADDRESS_VISIBILITY_NOBODY,
Realm.EMAIL_ADDRESS_VISIBILITY_NOBODY,
acting_user=None,
)
customer = Customer.objects.create(realm=lear_realm, stripe_customer_id="cus_123")
customer = Customer.objects.create(realm=get_realm("lear"), stripe_customer_id="cus_123")
now = datetime(2016, 1, 2, tzinfo=timezone.utc)
plan = CustomerPlan.objects.create(
customer=customer,
@@ -245,49 +211,39 @@ class TestSupportEndpoint(ZulipTestCase):
['<input type="text" name="q" class="input-xxlarge search-query"'], result
)
result = get_check_query_result(self.example_email("hamlet"), 1)
result = self.client_get("/activity/support", {"q": self.example_email("hamlet")})
check_hamlet_user_query_result(result)
check_zulip_realm_query_result(result)
# Search should be case-insensitive:
assert self.example_email("hamlet") != self.example_email("hamlet").upper()
result = get_check_query_result(self.example_email("hamlet").upper(), 1)
check_hamlet_user_query_result(result)
check_zulip_realm_query_result(result)
result = get_check_query_result(lear_user.email, 1)
check_lear_user_query_result(result)
check_lear_realm_query_result(result)
result = get_check_query_result(self.example_email("polonius"), 1)
result = self.client_get("/activity/support", {"q": self.example_email("polonius")})
check_polonius_user_query_result(result)
check_zulip_realm_query_result(result)
result = get_check_query_result("lear", 1)
result = self.client_get("/activity/support", {"q": "lear"})
check_lear_realm_query_result(result)
result = get_check_query_result("http://lear.testserver", 1)
result = self.client_get("/activity/support", {"q": "http://lear.testserver"})
check_lear_realm_query_result(result)
with self.settings(REALM_HOSTS={"zulip": "localhost"}):
result = get_check_query_result("http://localhost", 1)
result = self.client_get("/activity/support", {"q": "http://localhost"})
check_zulip_realm_query_result(result)
result = get_check_query_result("hamlet@zulip.com, lear", 2)
result = self.client_get("/activity/support", {"q": "hamlet@zulip.com, lear"})
check_hamlet_user_query_result(result)
check_zulip_realm_query_result(result)
check_lear_realm_query_result(result)
result = get_check_query_result("King hamlet,lear", 2)
result = self.client_get("/activity/support", {"q": "King hamlet,lear"})
check_hamlet_user_query_result(result)
check_zulip_realm_query_result(result)
check_lear_realm_query_result(result)
result = get_check_query_result("Othello, the Moor of Venice", 1)
result = self.client_get("/activity/support", {"q": "Othello, the Moor of Venice"})
check_othello_user_query_result(result)
check_zulip_realm_query_result(result)
result = get_check_query_result("lear, Hamlet <hamlet@zulip.com>", 2)
result = self.client_get("/activity/support", {"q": "lear, Hamlet <hamlet@zulip.com>"})
check_hamlet_user_query_result(result)
check_zulip_realm_query_result(result)
check_lear_realm_query_result(result)
@@ -298,76 +254,50 @@ class TestSupportEndpoint(ZulipTestCase):
):
self.client_post("/accounts/home/", {"email": self.nonreg_email("test")})
self.login("iago")
result = get_check_query_result(self.nonreg_email("test"), 1)
result = self.client_get("/activity/support", {"q": self.nonreg_email("test")})
check_preregistration_user_query_result(result, self.nonreg_email("test"))
check_zulip_realm_query_result(result)
create_invitation("Denmark", self.nonreg_email("test1"))
result = get_check_query_result(self.nonreg_email("test1"), 1)
invite_expires_in_days = 10
stream_ids = [self.get_stream_id("Denmark")]
invitee_emails = [self.nonreg_email("test1")]
self.client_post(
"/json/invites",
{
"invitee_emails": invitee_emails,
"stream_ids": orjson.dumps(stream_ids).decode(),
"invite_expires_in_days": invite_expires_in_days,
"invite_as": PreregistrationUser.INVITE_AS["MEMBER"],
},
)
result = self.client_get("/activity/support", {"q": self.nonreg_email("test1")})
check_preregistration_user_query_result(result, self.nonreg_email("test1"), invite=True)
check_zulip_realm_query_result(result)
email = self.nonreg_email("alice")
self.submit_realm_creation_form(
email, realm_subdomain="zuliptest", realm_name="Zulip test"
)
result = get_check_query_result(email, 1)
self.client_post("/new/", {"email": email})
result = self.client_get("/activity/support", {"q": email})
check_realm_creation_query_result(result, email)
invite_expires_in_minutes = 10 * 24 * 60
do_create_multiuse_invite_link(
self.example_user("hamlet"),
invited_as=1,
invite_expires_in_minutes=invite_expires_in_minutes,
invite_expires_in_days=invite_expires_in_days,
)
result = get_check_query_result("zulip", 2)
result = self.client_get("/activity/support", {"q": "zulip"})
check_multiuse_invite_link_query_result(result)
check_zulip_realm_query_result(result)
MultiuseInvite.objects.all().delete()
do_send_realm_reactivation_email(get_realm("zulip"), acting_user=None)
result = get_check_query_result("zulip", 2)
result = self.client_get("/activity/support", {"q": "zulip"})
check_realm_reactivation_link_query_result(result)
check_zulip_realm_query_result(result)
lear_nonreg_email = "newguy@lear.org"
self.client_post("/accounts/home/", {"email": lear_nonreg_email}, subdomain="lear")
result = get_check_query_result(lear_nonreg_email, 1)
check_preregistration_user_query_result(result, lear_nonreg_email)
check_lear_realm_query_result(result)
self.login_user(lear_user)
create_invitation("general", "newguy2@lear.org", lear_realm)
result = get_check_query_result("newguy2@lear.org", 1, lear_realm.string_id)
check_preregistration_user_query_result(result, "newguy2@lear.org", invite=True)
check_lear_realm_query_result(result)
def test_get_org_type_display_name(self) -> None:
self.assertEqual(get_org_type_display_name(Realm.ORG_TYPES["business"]["id"]), "Business")
self.assertEqual(get_org_type_display_name(883), "")
def test_unspecified_org_type_correctly_displayed(self) -> None:
"""
Unspecified org type is special in that it is marked to not be shown
on the registration page (because organizations are not meant to be able to choose it),
but should be correctly shown at the /support/ endpoint.
"""
realm = get_realm("zulip")
do_change_realm_org_type(realm, 0, acting_user=None)
self.assertEqual(realm.org_type, 0)
self.login("iago")
result = self.client_get("/activity/support", {"q": "zulip"}, subdomain="zulip")
self.assert_in_success_response(
[
f'<input type="hidden" name="realm_id" value="{realm.id}"',
'<option value="0" selected>',
],
result,
)
@mock.patch("analytics.views.support.update_billing_method_of_current_plan")
def test_change_billing_method(self, m: mock.Mock) -> None:
cordelia = self.example_user("cordelia")
@@ -633,7 +563,7 @@ class TestSupportEndpoint(ZulipTestCase):
"/activity/support",
{
"realm_id": f"{iago.realm_id}",
"modify_plan": "downgrade_at_billing_cycle_end",
"downgrade_method": "downgrade_at_billing_cycle_end",
},
)
m.assert_called_once_with(get_realm("zulip"))
@@ -648,7 +578,7 @@ class TestSupportEndpoint(ZulipTestCase):
"/activity/support",
{
"realm_id": f"{iago.realm_id}",
"modify_plan": "downgrade_now_without_additional_licenses",
"downgrade_method": "downgrade_now_without_additional_licenses",
},
)
m.assert_called_once_with(get_realm("zulip"))
@@ -664,7 +594,7 @@ class TestSupportEndpoint(ZulipTestCase):
"/activity/support",
{
"realm_id": f"{iago.realm_id}",
"modify_plan": "downgrade_now_void_open_invoices",
"downgrade_method": "downgrade_now_void_open_invoices",
},
)
m1.assert_called_once_with(get_realm("zulip"))
@@ -673,17 +603,6 @@ class TestSupportEndpoint(ZulipTestCase):
["zulip downgraded and voided 1 open invoices"], result
)
with mock.patch("analytics.views.support.switch_realm_from_standard_to_plus_plan") as m:
result = self.client_post(
"/activity/support",
{
"realm_id": f"{iago.realm_id}",
"modify_plan": "upgrade_to_plus",
},
)
m.assert_called_once_with(get_realm("zulip"))
self.assert_in_success_response(["zulip upgraded to Plus"], result)
def test_scrub_realm(self) -> None:
cordelia = self.example_user("cordelia")
lear_realm = get_realm("lear")
@@ -708,26 +627,3 @@ class TestSupportEndpoint(ZulipTestCase):
result = self.client_post("/activity/support", {"realm_id": f"{lear_realm.id}"})
self.assert_json_error(result, "Invalid parameters")
m.assert_not_called()
def test_delete_user(self) -> None:
cordelia = self.example_user("cordelia")
hamlet = self.example_user("hamlet")
hamlet_email = hamlet.delivery_email
realm = get_realm("zulip")
self.login_user(cordelia)
result = self.client_post(
"/activity/support", {"realm_id": f"{realm.id}", "delete_user_by_id": hamlet.id}
)
self.assertEqual(result.status_code, 302)
self.assertEqual(result["Location"], "/login/")
self.login("iago")
with mock.patch("analytics.views.support.do_delete_user_preserving_messages") as m:
result = self.client_post(
"/activity/support",
{"realm_id": f"{realm.id}", "delete_user_by_id": hamlet.id},
)
m.assert_called_once_with(hamlet)
self.assert_in_success_response([f"{hamlet_email} in zulip deleted"], result)


@@ -1,24 +1,17 @@
import re
import sys
from datetime import datetime
from typing import Any, Collection, Dict, List, Optional, Sequence
from urllib.parse import urlencode
from html import escape
from typing import Any, Dict, List, Optional, Sequence
import pytz
from django.conf import settings
from django.db.backends.utils import CursorWrapper
from django.db.models.query import QuerySet
from django.template import loader
from django.urls import reverse
from markupsafe import Markup
from markupsafe import Markup as mark_safe
from zerver.lib.url_encoding import append_url_query_string
from zerver.models import UserActivity, get_realm
if sys.version_info < (3, 9): # nocoverage
from backports import zoneinfo
else: # nocoverage
import zoneinfo
eastern_tz = zoneinfo.ZoneInfo("America/New_York")
eastern_tz = pytz.timezone("US/Eastern")
if settings.BILLING_ENABLED:
@@ -28,6 +21,7 @@ if settings.BILLING_ENABLED:
def make_table(
title: str, cols: Sequence[str], rows: Sequence[Any], has_row_class: bool = False
) -> str:
if not has_row_class:
def fix_row(row: Any) -> Dict[str, Any]:
@@ -46,7 +40,7 @@ def make_table(
def dictfetchall(cursor: CursorWrapper) -> List[Dict[str, Any]]:
"""Returns all rows from a cursor as a dict"""
"Returns all rows from a cursor as a dict"
desc = cursor.description
return [dict(zip((col[0] for col in desc), row)) for row in cursor.fetchall()]
@@ -58,55 +52,45 @@ def format_date_for_activity_reports(date: Optional[datetime]) -> str:
return ""
def user_activity_link(email: str, user_profile_id: int) -> Markup:
def user_activity_link(email: str, user_profile_id: int) -> mark_safe:
from analytics.views.user_activity import get_user_activity
url = reverse(get_user_activity, kwargs=dict(user_profile_id=user_profile_id))
return Markup('<a href="{url}">{email}</a>').format(url=url, email=email)
email_link = f'<a href="{escape(url)}">{escape(email)}</a>'
return mark_safe(email_link)
def realm_activity_link(realm_str: str) -> Markup:
def realm_activity_link(realm_str: str) -> mark_safe:
from analytics.views.realm_activity import get_realm_activity
url = reverse(get_realm_activity, kwargs=dict(realm_str=realm_str))
return Markup('<a href="{url}">{realm_str}</a>').format(url=url, realm_str=realm_str)
realm_link = f'<a href="{escape(url)}">{escape(realm_str)}</a>'
return mark_safe(realm_link)
def realm_stats_link(realm_str: str) -> Markup:
def realm_stats_link(realm_str: str) -> mark_safe:
from analytics.views.stats import stats_for_realm
url = reverse(stats_for_realm, kwargs=dict(realm_str=realm_str))
return Markup('<a href="{url}"><i class="fa fa-pie-chart"></i></a>').format(url=url)
stats_link = f'<a href="{escape(url)}"><i class="fa fa-pie-chart"></i>{escape(realm_str)}</a>'
return mark_safe(stats_link)
def realm_support_link(realm_str: str) -> Markup:
support_url = reverse("support")
query = urlencode({"q": realm_str})
url = append_url_query_string(support_url, query)
return Markup('<a href="{url}">{realm_str}</a>').format(url=url, realm_str=realm_str)
def realm_url_link(realm_str: str) -> Markup:
url = get_realm(realm_str).uri
return Markup('<a href="{url}"><i class="fa fa-home"></i></a>').format(url=url)
def remote_installation_stats_link(server_id: int, hostname: str) -> Markup:
def remote_installation_stats_link(server_id: int, hostname: str) -> mark_safe:
from analytics.views.stats import stats_for_remote_installation
url = reverse(stats_for_remote_installation, kwargs=dict(remote_server_id=server_id))
return Markup('<a href="{url}"><i class="fa fa-pie-chart"></i>{hostname}</a>').format(
url=url, hostname=hostname
)
stats_link = f'<a href="{escape(url)}"><i class="fa fa-pie-chart"></i>{escape(hostname)}</a>'
return mark_safe(stats_link)
def get_user_activity_summary(records: Collection[UserActivity]) -> Dict[str, Any]:
def get_user_activity_summary(records: List[QuerySet]) -> Dict[str, Any]:
#: The type annotation used above is clearly overly permissive.
#: We should perhaps use TypedDict to clearly lay out the schema
#: for the user activity summary.
summary: Dict[str, Any] = {}
def update(action: str, record: UserActivity) -> None:
def update(action: str, record: QuerySet) -> None:
if action not in summary:
summary[action] = dict(
count=record.count,
@@ -120,9 +104,8 @@ def get_user_activity_summary(records: Collection[UserActivity]) -> Dict[str, An
)
if records:
first_record = next(iter(records))
summary["name"] = first_record.user_profile.full_name
summary["user_profile_id"] = first_record.user_profile.id
summary["name"] = records[0].user_profile.full_name
summary["user_profile_id"] = records[0].user_profile.id
for record in records:
client = record.client.name
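
The central change in the link helpers above is how the HTML is built: one side formats through markupsafe.Markup, whose format() method escapes its arguments, while the other escapes each value by hand and wraps the result in mark_safe. A minimal sketch of the two approaches, with hypothetical function names and values:

# Sketch only -- link_with_markup / link_with_escape are illustrative names.
from html import escape

from markupsafe import Markup


def link_with_markup(url: str, label: str) -> Markup:
    # Markup.format() escapes url and label, so callers cannot inject HTML.
    return Markup('<a href="{url}">{label}</a>').format(url=url, label=label)


def link_with_escape(url: str, label: str) -> str:
    # Manual escaping before assembling the string; equivalent only as long
    # as no interpolated value is forgotten.
    return f'<a href="{escape(url)}">{escape(label)}</a>'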


@@ -1,7 +1,6 @@
import itertools
import time
from collections import defaultdict
from contextlib import suppress
from datetime import datetime, timedelta
from typing import Callable, Dict, List, Optional, Sequence, Tuple, Union
@@ -11,7 +10,7 @@ from django.http import HttpRequest, HttpResponse
from django.shortcuts import render
from django.template import loader
from django.utils.timezone import now as timezone_now
from markupsafe import Markup
from markupsafe import Markup as mark_safe
from psycopg2.sql import SQL, Composable, Literal
from analytics.lib.counts import COUNT_STATS
@@ -21,15 +20,13 @@ from analytics.views.activity_common import (
make_table,
realm_activity_link,
realm_stats_link,
realm_support_link,
realm_url_link,
remote_installation_stats_link,
)
from analytics.views.support import get_plan_name
from zerver.decorator import require_server_admin
from zerver.lib.request import has_request_variables
from zerver.lib.timestamp import timestamp_to_datetime
from zerver.models import Realm, UserActivityInterval, get_org_type_display_name
from zerver.models import Realm, UserActivityInterval, UserProfile, get_org_type_display_name
if settings.BILLING_ENABLED:
from corporate.lib.stripe import (
@@ -38,7 +35,7 @@ if settings.BILLING_ENABLED:
)
def get_realm_day_counts() -> Dict[str, Dict[str, Markup]]:
def get_realm_day_counts() -> Dict[str, Dict[str, str]]:
query = SQL(
"""
select
@@ -78,7 +75,7 @@ def get_realm_day_counts() -> Dict[str, Dict[str, Markup]]:
min_cnt = min(raw_cnts[1:])
max_cnt = max(raw_cnts[1:])
def format_count(cnt: int, style: Optional[str] = None) -> Markup:
def format_count(cnt: int, style: Optional[str] = None) -> str:
if style is not None:
good_bad = style
elif cnt == min_cnt:
@@ -88,11 +85,9 @@ def get_realm_day_counts() -> Dict[str, Dict[str, Markup]]:
else:
good_bad = "neutral"
return Markup('<td class="number {good_bad}">{cnt}</td>').format(
good_bad=good_bad, cnt=cnt
)
return f'<td class="number {good_bad}">{cnt}</td>'
cnts = format_count(raw_cnts[0], "neutral") + Markup().join(map(format_count, raw_cnts[1:]))
cnts = format_count(raw_cnts[0], "neutral") + "".join(map(format_count, raw_cnts[1:]))
result[string_id] = dict(cnts=cnts)
return result
@@ -192,10 +187,19 @@ def realm_summary_table(realm_minutes: Dict[str, float]) -> str:
rows = dictfetchall(cursor)
cursor.close()
# Fetch all the realm administrator users
realm_owners: Dict[str, List[str]] = defaultdict(list)
for up in UserProfile.objects.select_related("realm").filter(
role=UserProfile.ROLE_REALM_OWNER,
is_active=True,
):
realm_owners[up.realm.string_id].append(up.delivery_email)
for row in rows:
row["date_created_day"] = row["date_created"].strftime("%Y-%m-%d")
row["age_days"] = int((now - row["date_created"]).total_seconds() / 86400)
row["is_new"] = row["age_days"] < 12 * 7
row["realm_owner_emails"] = ", ".join(realm_owners[row["string_id"]])
# get messages sent per day
counts = get_realm_day_counts()
@@ -244,14 +248,14 @@ def realm_summary_table(realm_minutes: Dict[str, float]) -> str:
hours = minutes / 60.0
total_hours += hours
row["hours"] = str(int(hours))
with suppress(Exception):
try:
row["hours_per_user"] = "{:.1f}".format(hours / row["dau_count"])
except Exception:
pass
# formatting
for row in rows:
row["realm_url"] = realm_url_link(row["string_id"])
row["stats_link"] = realm_stats_link(row["string_id"])
row["support_link"] = realm_support_link(row["string_id"])
row["string_id"] = realm_activity_link(row["string_id"])
# Count active sites
@@ -277,10 +281,9 @@ def realm_summary_table(realm_minutes: Dict[str, float]) -> str:
org_type_string="",
effective_rate="",
arr=total_arr,
realm_url="",
stats_link="",
support_link="",
date_created_day="",
realm_owner_emails="",
dau_count=total_dau_count,
user_profile_count=total_user_profile_count,
bot_count=total_bot_count,
@@ -295,19 +298,18 @@ def realm_summary_table(realm_minutes: Dict[str, float]) -> str:
dict(
rows=rows,
num_active_sites=num_active_sites,
utctime=now.strftime("%Y-%m-%d %H:%M %Z"),
utctime=now.strftime("%Y-%m-%d %H:%MZ"),
billing_enabled=settings.BILLING_ENABLED,
),
)
return content
def user_activity_intervals() -> Tuple[Markup, Dict[str, float]]:
def user_activity_intervals() -> Tuple[mark_safe, Dict[str, float]]:
day_end = timestamp_to_datetime(time.time())
day_start = day_end - timedelta(hours=24)
output = Markup()
output += "Per-user online duration for the last 24 hours:\n"
output = "Per-user online duration for the last 24 hours:\n"
total_duration = timedelta(0)
all_intervals = (
@@ -338,7 +340,7 @@ def user_activity_intervals() -> Tuple[Markup, Dict[str, float]]:
for string_id, realm_intervals in itertools.groupby(all_intervals, by_string_id):
realm_duration = timedelta(0)
output += Markup("<hr>") + f"{string_id}\n"
output += f"<hr>{string_id}\n"
for email, intervals in itertools.groupby(realm_intervals, by_email):
duration = timedelta(0)
for interval in intervals:
@@ -355,7 +357,7 @@ def user_activity_intervals() -> Tuple[Markup, Dict[str, float]]:
output += f"\nTotal duration: {total_duration}\n"
output += f"\nTotal duration in minutes: {total_duration.total_seconds() / 60.}\n"
output += f"Total duration amortized to a month: {total_duration.total_seconds() * 30. / 60.}"
content = Markup("<pre>{}</pre>").format(output)
content = mark_safe("<pre>" + output + "</pre>")
return content, realm_minutes
@@ -370,7 +372,7 @@ def ad_hoc_queries() -> List[Dict[str, str]]:
cursor.close()
def fix_rows(
i: int, fixup_func: Union[Callable[[str], Markup], Callable[[datetime], str]]
i: int, fixup_func: Union[Callable[[str], mark_safe], Callable[[datetime], str]]
) -> None:
for row in rows:
row[i] = fixup_func(row[i])
@@ -408,7 +410,7 @@ def ad_hoc_queries() -> List[Dict[str, str]]:
for mobile_type in ["Android", "ZulipiOS"]:
title = f"{mobile_type} usage"
query: Composable = SQL(
query = SQL(
"""
select
realm.string_id,


@@ -3,7 +3,7 @@ from datetime import datetime
from typing import Any, Dict, List, Optional, Set, Tuple
from django.db import connection
from django.db.models import QuerySet
from django.db.models.query import QuerySet
from django.http import HttpRequest, HttpResponse, HttpResponseNotFound
from django.shortcuts import render
from django.utils.timezone import now as timezone_now
@@ -13,14 +13,13 @@ from analytics.views.activity_common import (
format_date_for_activity_reports,
get_user_activity_summary,
make_table,
realm_stats_link,
user_activity_link,
)
from zerver.decorator import require_server_admin
from zerver.models import Realm, UserActivity
def get_user_activity_records_for_realm(realm: str, is_bot: bool) -> QuerySet[UserActivity]:
def get_user_activity_records_for_realm(realm: str, is_bot: bool) -> QuerySet:
fields = [
"user_profile__full_name",
"user_profile__delivery_email",
@@ -41,11 +40,11 @@ def get_user_activity_records_for_realm(realm: str, is_bot: bool) -> QuerySet[Us
def realm_user_summary_table(
all_records: QuerySet[UserActivity], admin_emails: Set[str]
all_records: List[QuerySet], admin_emails: Set[str]
) -> Tuple[Dict[str, Any], str]:
user_records = {}
def by_email(record: UserActivity) -> str:
def by_email(record: QuerySet) -> str:
return record.user_profile.delivery_email
for email, records in itertools.groupby(all_records, by_email):
@@ -237,7 +236,7 @@ def get_realm_activity(request: HttpRequest, realm_str: str) -> HttpResponse:
admin_emails = {admin.delivery_email for admin in admins}
for is_bot, page_title in [(False, "Humans"), (True, "Bots")]:
all_records = get_user_activity_records_for_realm(realm_str, is_bot)
all_records = list(get_user_activity_records_for_realm(realm_str, is_bot))
user_records, content = realm_user_summary_table(all_records, admin_emails)
all_user_records.update(user_records)
@@ -253,10 +252,8 @@ def get_realm_activity(request: HttpRequest, realm_str: str) -> HttpResponse:
data += [(page_title, content)]
title = realm_str
realm_stats = realm_stats_link(realm_str)
return render(
request,
"analytics/activity.html",
context=dict(data=data, realm_stats_link=realm_stats, title=title),
context=dict(data=data, realm_link=None, title=title),
)


@@ -1,10 +1,10 @@
import logging
from collections import defaultdict
from datetime import datetime, timedelta, timezone
from typing import Any, Dict, List, Optional, Tuple, Type, TypeVar, Union, cast
from typing import Any, Dict, List, Optional, Tuple, Type, Union, cast
from django.conf import settings
from django.db.models import QuerySet
from django.db.models.query import QuerySet
from django.http import HttpRequest, HttpResponse, HttpResponseNotFound
from django.shortcuts import render
from django.utils import translation
@@ -49,36 +49,16 @@ def is_analytics_ready(realm: Realm) -> bool:
def render_stats(
request: HttpRequest,
data_url_suffix: str,
realm: Optional[Realm],
*,
title: Optional[str] = None,
target_name: str,
for_installation: bool = False,
remote: bool = False,
analytics_ready: bool = True,
) -> HttpResponse:
assert request.user.is_authenticated
if realm is not None:
# Same query to get guest user count as in get_seat_count in corporate/lib/stripe.py.
guest_users = UserProfile.objects.filter(
realm=realm, is_active=True, is_bot=False, role=UserProfile.ROLE_GUEST
).count()
space_used = realm.currently_used_upload_space_bytes()
if title:
pass
else:
title = realm.name or realm.string_id
else:
assert title
guest_users = None
space_used = None
page_params = dict(
data_url_suffix=data_url_suffix,
for_installation=for_installation,
remote=remote,
upload_space_used=space_used,
guest_users=guest_users,
)
request_language = get_and_set_request_language(
@@ -93,9 +73,7 @@ def render_stats(
request,
"analytics/stats.html",
context=dict(
target_name=title,
page_params=page_params,
analytics_ready=analytics_ready,
target_name=target_name, page_params=page_params, analytics_ready=analytics_ready
),
)
@@ -108,7 +86,9 @@ def stats(request: HttpRequest) -> HttpResponse:
# TODO: Make @zulip_login_required pass the UserProfile so we
# can use @require_member_or_admin
raise JsonableError(_("Not allowed for guest users"))
return render_stats(request, "", realm, analytics_ready=is_analytics_ready(realm))
return render_stats(
request, "", realm.name or realm.string_id, analytics_ready=is_analytics_ready(realm)
)
@require_server_admin
@@ -122,7 +102,7 @@ def stats_for_realm(request: HttpRequest, realm_str: str) -> HttpResponse:
return render_stats(
request,
f"/realm/{realm_str}",
realm,
realm.name or realm.string_id,
analytics_ready=is_analytics_ready(realm),
)
@@ -137,29 +117,27 @@ def stats_for_remote_realm(
return render_stats(
request,
f"/remote/{server.id}/realm/{remote_realm_id}",
None,
title=f"Realm {remote_realm_id} on server {server.hostname}",
f"Realm {remote_realm_id} on server {server.hostname}",
)
@require_server_admin_api
@has_request_variables
def get_chart_data_for_realm(
request: HttpRequest, /, user_profile: UserProfile, realm_str: str, **kwargs: Any
request: HttpRequest, user_profile: UserProfile, realm_str: str, **kwargs: Any
) -> HttpResponse:
try:
realm = get_realm(realm_str)
except Realm.DoesNotExist:
raise JsonableError(_("Invalid organization"))
return get_chart_data(request, user_profile, realm=realm, **kwargs)
return get_chart_data(request=request, user_profile=user_profile, realm=realm, **kwargs)
@require_server_admin_api
@has_request_variables
def get_chart_data_for_remote_realm(
request: HttpRequest,
/,
user_profile: UserProfile,
remote_server_id: int,
remote_realm_id: int,
@@ -168,8 +146,8 @@ def get_chart_data_for_remote_realm(
assert settings.ZILENCER_ENABLED
server = RemoteZulipServer.objects.get(id=remote_server_id)
return get_chart_data(
request,
user_profile,
request=request,
user_profile=user_profile,
server=server,
remote=True,
remote_realm_id=int(remote_realm_id),
@@ -179,8 +157,7 @@ def get_chart_data_for_remote_realm(
@require_server_admin
def stats_for_installation(request: HttpRequest) -> HttpResponse:
assert request.user.is_authenticated
return render_stats(request, "/installation", None, title="installation", for_installation=True)
return render_stats(request, "/installation", "installation", True)
@require_server_admin
@@ -190,26 +167,26 @@ def stats_for_remote_installation(request: HttpRequest, remote_server_id: int) -
return render_stats(
request,
f"/remote/{server.id}/installation",
None,
title=f"remote installation {server.hostname}",
for_installation=True,
remote=True,
f"remote installation {server.hostname}",
True,
True,
)
@require_server_admin_api
@has_request_variables
def get_chart_data_for_installation(
request: HttpRequest, /, user_profile: UserProfile, chart_name: str = REQ(), **kwargs: Any
request: HttpRequest, user_profile: UserProfile, chart_name: str = REQ(), **kwargs: Any
) -> HttpResponse:
return get_chart_data(request, user_profile, for_installation=True, **kwargs)
return get_chart_data(
request=request, user_profile=user_profile, for_installation=True, **kwargs
)
@require_server_admin_api
@has_request_variables
def get_chart_data_for_remote_installation(
request: HttpRequest,
/,
user_profile: UserProfile,
remote_server_id: int,
chart_name: str = REQ(),
@@ -218,8 +195,8 @@ def get_chart_data_for_remote_installation(
assert settings.ZILENCER_ENABLED
server = RemoteZulipServer.objects.get(id=remote_server_id)
return get_chart_data(
request,
user_profile,
request=request,
user_profile=user_profile,
for_installation=True,
remote=True,
server=server,
@@ -293,8 +270,8 @@ def get_chart_data(
stats[0]: {
"public_stream": _("Public streams"),
"private_stream": _("Private streams"),
"private_message": _("Direct messages"),
"huddle_message": _("Group direct messages"),
"private_message": _("Private messages"),
"huddle_message": _("Group private messages"),
}
}
labels_sort_function = lambda data: sort_by_totals(data["everyone"])
@@ -459,35 +436,30 @@ def sort_client_labels(data: Dict[str, Dict[str, List[int]]]) -> List[str]:
return [label for label, sort_value in sorted(label_sort_values.items(), key=lambda x: x[1])]
CountT = TypeVar("CountT", bound=BaseCount)
def table_filtered_to_id(table: Type[CountT], key_id: int) -> QuerySet[CountT]:
def table_filtered_to_id(table: Type[BaseCount], key_id: int) -> QuerySet:
if table == RealmCount:
return table.objects.filter(realm_id=key_id)
return RealmCount.objects.filter(realm_id=key_id)
elif table == UserCount:
return table.objects.filter(user_id=key_id)
return UserCount.objects.filter(user_id=key_id)
elif table == StreamCount:
return table.objects.filter(stream_id=key_id)
return StreamCount.objects.filter(stream_id=key_id)
elif table == InstallationCount:
return table.objects.all()
return InstallationCount.objects.all()
elif settings.ZILENCER_ENABLED and table == RemoteInstallationCount:
return table.objects.filter(server_id=key_id)
return RemoteInstallationCount.objects.filter(server_id=key_id)
elif settings.ZILENCER_ENABLED and table == RemoteRealmCount:
return table.objects.filter(realm_id=key_id)
return RemoteRealmCount.objects.filter(realm_id=key_id)
else:
raise AssertionError(f"Unknown table: {table}")
def client_label_map(name: str) -> str:
if name == "website":
return "Web app"
return "Website"
if name.startswith("desktop app"):
return "Old desktop app"
if name == "ZulipElectron":
return "Desktop app"
if name == "ZulipTerminal":
return "Terminal app"
if name == "ZulipAndroid":
return "Old Android app"
if name == "ZulipiOS":
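
One side of the table_filtered_to_id hunk above parameterizes the return type with a TypeVar bound to BaseCount, so callers get a queryset typed to the concrete table they pass in; the other side returns an unparameterized QuerySet and names each model explicitly. A minimal sketch of the generic version, trimmed to two tables and assuming the analytics models shown elsewhere in this diff:

# Sketch only -- reduced to two cases; the real helper handles more tables.
from typing import Type, TypeVar

from django.db.models import QuerySet

from analytics.models import BaseCount, RealmCount, UserCount

CountT = TypeVar("CountT", bound=BaseCount)


def table_filtered_to_id(table: Type[CountT], key_id: int) -> QuerySet[CountT]:
    # Filtering on the passed-in class keeps the element type tied to the
    # argument, e.g. QuerySet[RealmCount] when RealmCount is passed.
    if table == RealmCount:
        return table.objects.filter(realm_id=key_id)
    elif table == UserCount:
        return table.objects.filter(user_id=key_id)
    raise AssertionError(f"Unknown table: {table}")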


@@ -1,15 +1,12 @@
import urllib
from contextlib import suppress
from dataclasses import dataclass
from datetime import timedelta
from decimal import Decimal
from typing import Any, Dict, Iterable, List, Optional
from typing import Any, Dict, List, Optional
from urllib.parse import urlencode
from django.conf import settings
from django.core.exceptions import ValidationError
from django.core.validators import URLValidator
from django.db.models import Q
from django.http import HttpRequest, HttpResponse, HttpResponseRedirect
from django.shortcuts import render
from django.urls import reverse
@@ -18,7 +15,7 @@ from django.utils.timezone import now as timezone_now
from django.utils.translation import gettext as _
from confirmation.models import Confirmation, confirmation_url
from confirmation.settings import STATUS_USED
from confirmation.settings import STATUS_ACTIVE
from zerver.actions.create_realm import do_change_realm_subdomain
from zerver.actions.realm_settings import (
do_change_realm_org_type,
@@ -27,7 +24,6 @@ from zerver.actions.realm_settings import (
do_scrub_realm,
do_send_realm_reactivation_email,
)
from zerver.actions.users import do_delete_user_preserving_messages
from zerver.decorator import require_server_admin
from zerver.forms import check_subdomain_available
from zerver.lib.exceptions import JsonableError
@@ -37,14 +33,11 @@ from zerver.lib.subdomains import get_subdomain_from_hostname
from zerver.lib.validator import check_bool, check_string_in, to_decimal, to_non_negative_int
from zerver.models import (
MultiuseInvite,
PreregistrationRealm,
PreregistrationUser,
Realm,
RealmReactivationStatus,
UserProfile,
get_org_type_display_name,
get_realm,
get_user_profile_by_id,
)
from zerver.views.invite import get_invitee_emails_set
@@ -57,17 +50,11 @@ if settings.BILLING_ENABLED:
get_discount_for_realm,
get_latest_seat_count,
make_end_of_cycle_updates_if_needed,
switch_realm_from_standard_to_plus_plan,
update_billing_method_of_current_plan,
update_sponsorship_status,
void_all_open_invoices,
)
from corporate.models import (
Customer,
CustomerPlan,
get_current_plan_by_realm,
get_customer_by_realm,
)
from corporate.models import get_current_plan_by_realm, get_customer_by_realm
def get_plan_name(plan_type: int) -> str:
@@ -81,7 +68,7 @@ def get_plan_name(plan_type: int) -> str:
def get_confirmations(
types: List[int], object_ids: Iterable[int], hostname: Optional[str] = None
types: List[int], object_ids: List[int], hostname: Optional[str] = None
) -> List[Dict[str, Any]]:
lowest_datetime = timezone_now() - timedelta(days=30)
confirmations = Confirmation.objects.filter(
@@ -97,10 +84,10 @@ def get_confirmations(
assert content_object is not None
if hasattr(content_object, "status"):
if content_object.status == STATUS_USED:
link_status = "Link has been used"
if content_object.status == STATUS_ACTIVE:
link_status = "Link has been clicked"
else:
link_status = "Link has not been used"
link_status = "Link has never been clicked"
else:
link_status = ""
@@ -125,11 +112,10 @@ def get_confirmations(
return confirmation_dicts
VALID_MODIFY_PLAN_METHODS = [
VALID_DOWNGRADE_METHODS = [
"downgrade_at_billing_cycle_end",
"downgrade_now_without_additional_licenses",
"downgrade_now_void_open_invoices",
"upgrade_to_plus",
]
VALID_STATUS_VALUES = [
@@ -143,14 +129,6 @@ VALID_BILLING_METHODS = [
]
@dataclass
class PlanData:
customer: Optional["Customer"] = None
current_plan: Optional["CustomerPlan"] = None
licenses: Optional[int] = None
licenses_used: Optional[int] = None
@require_server_admin
@has_request_variables
def support(
@@ -164,12 +142,11 @@ def support(
default=None, str_validator=check_string_in(VALID_BILLING_METHODS)
),
sponsorship_pending: Optional[bool] = REQ(default=None, json_validator=check_bool),
approve_sponsorship: bool = REQ(default=False, json_validator=check_bool),
modify_plan: Optional[str] = REQ(
default=None, str_validator=check_string_in(VALID_MODIFY_PLAN_METHODS)
approve_sponsorship: Optional[bool] = REQ(default=None, json_validator=check_bool),
downgrade_method: Optional[str] = REQ(
default=None, str_validator=check_string_in(VALID_DOWNGRADE_METHODS)
),
scrub_realm: bool = REQ(default=False, json_validator=check_bool),
delete_user_by_id: Optional[int] = REQ(default=None, converter=to_non_negative_int),
scrub_realm: Optional[bool] = REQ(default=None, json_validator=check_bool),
query: Optional[str] = REQ("q", default=None),
org_type: Optional[int] = REQ(default=None, converter=to_non_negative_int),
) -> HttpResponse:
@@ -188,7 +165,6 @@ def support(
if len(keys) != 2:
raise JsonableError(_("Invalid parameters"))
assert realm_id is not None
realm = Realm.objects.get(id=realm_id)
acting_user = request.user
@@ -257,43 +233,31 @@ def support(
elif approve_sponsorship:
do_approve_sponsorship(realm, acting_user=acting_user)
context["success_message"] = f"Sponsorship approved for {realm.string_id}"
elif modify_plan is not None:
if modify_plan == "downgrade_at_billing_cycle_end":
elif downgrade_method is not None:
if downgrade_method == "downgrade_at_billing_cycle_end":
downgrade_at_the_end_of_billing_cycle(realm)
context[
"success_message"
] = f"{realm.string_id} marked for downgrade at the end of billing cycle"
elif modify_plan == "downgrade_now_without_additional_licenses":
elif downgrade_method == "downgrade_now_without_additional_licenses":
downgrade_now_without_creating_additional_invoices(realm)
context[
"success_message"
] = f"{realm.string_id} downgraded without creating additional invoices"
elif modify_plan == "downgrade_now_void_open_invoices":
elif downgrade_method == "downgrade_now_void_open_invoices":
downgrade_now_without_creating_additional_invoices(realm)
voided_invoices_count = void_all_open_invoices(realm)
context[
"success_message"
] = f"{realm.string_id} downgraded and voided {voided_invoices_count} open invoices"
elif modify_plan == "upgrade_to_plus":
switch_realm_from_standard_to_plus_plan(realm)
context["success_message"] = f"{realm.string_id} upgraded to Plus"
elif scrub_realm:
do_scrub_realm(realm, acting_user=acting_user)
context["success_message"] = f"{realm.string_id} scrubbed."
elif delete_user_by_id:
user_profile_for_deletion = get_user_profile_by_id(delete_user_by_id)
user_email = user_profile_for_deletion.delivery_email
assert user_profile_for_deletion.realm == realm
do_delete_user_preserving_messages(user_profile_for_deletion)
context["success_message"] = f"{user_email} in {realm.subdomain} deleted."
if query:
key_words = get_invitee_emails_set(query)
case_insensitive_users_q = Q()
for key_word in key_words:
case_insensitive_users_q |= Q(delivery_email__iexact=key_word)
users = set(UserProfile.objects.filter(case_insensitive_users_q))
users = set(UserProfile.objects.filter(delivery_email__in=key_words))
realms = set(Realm.objects.filter(string_id__in=key_words))
for key_word in key_words:
@@ -305,11 +269,29 @@ def support(
if parse_result.port:
hostname = f"{hostname}:{parse_result.port}"
subdomain = get_subdomain_from_hostname(hostname)
with suppress(Realm.DoesNotExist):
try:
realms.add(get_realm(subdomain))
except Realm.DoesNotExist:
pass
except ValidationError:
users.update(UserProfile.objects.filter(full_name__iexact=key_word))
for realm in realms:
realm.customer = get_customer_by_realm(realm)
current_plan = get_current_plan_by_realm(realm)
if current_plan is not None:
new_plan, last_ledger_entry = make_end_of_cycle_updates_if_needed(
current_plan, timezone_now()
)
if last_ledger_entry is not None:
if new_plan is not None:
realm.current_plan = new_plan
else:
realm.current_plan = current_plan
realm.current_plan.licenses = last_ledger_entry.licenses
realm.current_plan.licenses_used = get_latest_seat_count(realm)
# full_names can have , in them
users.update(UserProfile.objects.filter(full_name__iexact=query))
@@ -318,69 +300,22 @@ def support(
confirmations: List[Dict[str, Any]] = []
preregistration_user_ids = [
user.id for user in PreregistrationUser.objects.filter(email__in=key_words)
]
preregistration_users = PreregistrationUser.objects.filter(email__in=key_words)
confirmations += get_confirmations(
[Confirmation.USER_REGISTRATION, Confirmation.INVITATION],
preregistration_user_ids,
[Confirmation.USER_REGISTRATION, Confirmation.INVITATION, Confirmation.REALM_CREATION],
preregistration_users,
hostname=request.get_host(),
)
preregistration_realm_ids = [
user.id for user in PreregistrationRealm.objects.filter(email__in=key_words)
]
confirmations += get_confirmations(
[Confirmation.REALM_CREATION],
preregistration_realm_ids,
hostname=request.get_host(),
)
multiuse_invites = MultiuseInvite.objects.filter(realm__in=realms)
confirmations += get_confirmations([Confirmation.MULTIUSE_INVITE], multiuse_invites)
multiuse_invite_ids = [
invite.id for invite in MultiuseInvite.objects.filter(realm__in=realms)
]
confirmations += get_confirmations([Confirmation.MULTIUSE_INVITE], multiuse_invite_ids)
realm_reactivation_status_objects = RealmReactivationStatus.objects.filter(realm__in=realms)
confirmations += get_confirmations(
[Confirmation.REALM_REACTIVATION], [obj.id for obj in realm_reactivation_status_objects]
[Confirmation.REALM_REACTIVATION], [realm.id for realm in realms]
)
context["confirmations"] = confirmations
# We want a union of all realms that might appear in the search result,
# but not necessarily as a separate result item.
# Therefore, we do not modify the realms object in the context.
all_realms = realms.union(
[
confirmation["object"].realm
for confirmation in confirmations
# For confirmations, we only display realm details when the type is USER_REGISTRATION
# or INVITATION.
if confirmation["type"] in (Confirmation.USER_REGISTRATION, Confirmation.INVITATION)
]
+ [user.realm for user in users]
)
plan_data: Dict[int, PlanData] = {}
for realm in all_realms:
current_plan = get_current_plan_by_realm(realm)
plan_data[realm.id] = PlanData(
customer=get_customer_by_realm(realm),
current_plan=current_plan,
)
if current_plan is not None:
new_plan, last_ledger_entry = make_end_of_cycle_updates_if_needed(
current_plan, timezone_now()
)
if last_ledger_entry is not None:
if new_plan is not None:
plan_data[realm.id].current_plan = new_plan
else:
plan_data[realm.id].current_plan = current_plan
plan_data[realm.id].licenses = last_ledger_entry.licenses
plan_data[realm.id].licenses_used = get_latest_seat_count(realm)
context["plan_data"] = plan_data
def get_realm_owner_emails_as_string(realm: Realm) -> str:
return ", ".join(
realm.get_human_owner_users()

View File

@@ -1,7 +1,7 @@
from typing import Any, Dict, List, Tuple
from django.conf import settings
from django.db.models import QuerySet
from django.db.models.query import QuerySet
from django.http import HttpRequest, HttpResponse
from django.shortcuts import render
@@ -17,9 +17,7 @@ if settings.BILLING_ENABLED:
pass
def get_user_activity_records(
user_profile: UserProfile,
) -> QuerySet[UserActivity]:
def get_user_activity_records(user_profile: UserProfile) -> List[QuerySet]:
fields = [
"user_profile__full_name",
"query",
@@ -36,7 +34,7 @@ def get_user_activity_records(
return records
def raw_user_activity_table(records: QuerySet[UserActivity]) -> str:
def raw_user_activity_table(records: List[QuerySet]) -> str:
cols = [
"query",
"client",
@@ -44,7 +42,7 @@ def raw_user_activity_table(records: QuerySet[UserActivity]) -> str:
"last_visit",
]
def row(record: UserActivity) -> List[Any]:
def row(record: QuerySet) -> List[Any]:
return [
record.query,
record.client.name,

File diff suppressed because it is too large.

View File

@@ -1,103 +0,0 @@
# Construct a narrow
A **narrow** is a set of filters for Zulip messages, that can be based
on many different factors (like sender, stream, topic, search
keywords, etc.). Narrows are used in various places in the Zulip
API (most importantly, in the API for fetching messages).
It is simplest to explain the algorithm for encoding a search as a
narrow using a single example. Consider the following search query
(written as it would be entered in the Zulip web app's search box).
It filters for messages sent to stream `announce`, not sent by
`iago@zulip.com`, and containing the words `cool` and `sunglasses`:
```
stream:announce -sender:iago@zulip.com cool sunglasses
```
This query would be encoded for use in the Zulip API as a JSON list
of simple objects, as follows:
```json
[
{
"operator": "stream",
"operand": "announce"
},
{
"operator": "sender",
"operand": "iago@zulip.com",
"negated": true
},
{
"operator": "search",
"operand": "cool sunglasses"
}
]
```
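For illustration, here is a minimal sketch of passing such a narrow to the
official Python bindings' `get_messages` call; the `zuliprc` path and the
message-window parameters below are placeholders, not part of the query above.

```python
import zulip

# A sketch, assuming a configured zuliprc for a bot or user account.
client = zulip.Client(config_file="~/zuliprc")

narrow = [
    {"operator": "stream", "operand": "announce"},
    {"operator": "sender", "operand": "iago@zulip.com", "negated": True},
    {"operator": "search", "operand": "cool sunglasses"},
]

# Fetch the 50 most recent messages matching the narrow.
result = client.get_messages(
    {
        "anchor": "newest",
        "num_before": 50,
        "num_after": 0,
        "narrow": narrow,
    }
)
print(result["result"], len(result.get("messages", [])))
```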
The Zulip help center article on [searching for messages](/help/search-for-messages)
documents the majority of the search/narrow options supported by the
Zulip API.
Note that many narrows, including all that lack a `stream` or `streams`
operator, search the current user's personal message history. See
[searching shared history](/help/search-for-messages#searching-shared-history)
for details.
**Changes**: In Zulip 7.0 (feature level 177), support was added
for three filters related to direct messages: `is:dm`, `dm` and
`dm-including`. The `dm` operator replaced and deprecated the
`pm-with` operator. The `is:dm` filter replaced and deprecated
the `is:private` filter. The `dm-including` operator replaced and
deprecated the `group-pm-with` operator.
The `dm-including` and `group-pm-with` operators return slightly
different results. For example, `dm-including:1234` returns all
direct messages (1-on-1 and group) that include the current user
and the user with the unique user ID of `1234`. On the other hand,
`group-pm-with:1234` returned only group direct messages that included
the current user and the user with the unique user ID of `1234`.
Both `dm` and `is:dm` are aliases of `pm-with` and `is:private`
respectively, and return the same exact results that the deprecated
filters did.
## Narrows that use IDs
The `near` and `id` operators, documented in the help center, use message
IDs for their operands.
* `near:12345`: Search messages around the message with ID `12345`.
* `id:12345`: Search for only the message with ID `12345`.
There are a few additional narrow/search options (new in Zulip 2.1)
that use stream IDs or user IDs; these are not documented in the
help center because they are primarily useful to API clients:
* `stream:1234`: Search messages sent to the stream with ID `1234`.
* `sender:1234`: Search messages sent by user ID `1234`.
* `dm:1234`: Search the direct message conversation between
you and user ID `1234`.
* `dm:1234,5678`: Search the direct message conversation between
you, user ID `1234`, and user ID `5678`.
* `dm-including:1234`: Search all direct messages (1-on-1 and group)
that include you and user ID `1234`.
The operands for these search options must be encoded either as an
integer ID or a JSON list of integer IDs. For example, to query
messages sent by user 1234 to the direct message conversation with
yourself, user 1234, and user 5678, the correct JSON-encoded query is:
```json
[
{
"operator": "dm",
"operand": [1234, 5678]
},
{
"operator": "sender",
"operand": 1234
}
]
```
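As a sketch, the same query made through the Python bindings (with the
illustrative user IDs from above, and a placeholder `zuliprc` path) would
pass the numeric operands directly:

```python
import zulip

client = zulip.Client(config_file="~/zuliprc")  # placeholder path

# ID-based operands are passed as integers or lists of integers.
result = client.get_messages(
    {
        "anchor": "newest",
        "num_before": 100,
        "num_after": 0,
        "narrow": [
            {"operator": "dm", "operand": [1234, 5678]},
            {"operator": "sender", "operand": 1234},
        ],
    }
)
```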

View File

@@ -1,49 +0,0 @@
{generate_api_header(/scheduled_messages:post)}
## Usage examples
{start_tabs}
{generate_code_example(python)|/scheduled_messages:post|example}
{generate_code_example(javascript)|/scheduled_messages:post|example}
{tab|curl}
``` curl
# Create a scheduled stream message
curl -X POST {{ api_url }}/v1/scheduled_messages \
-u BOT_EMAIL_ADDRESS:BOT_API_KEY \
--data-urlencode type=stream \
--data-urlencode to=9 \
--data-urlencode topic=Hello \
--data-urlencode 'content=Nice to meet everyone!' \
--data-urlencode scheduled_delivery_timestamp=3165826990
# Create a scheduled direct message
curl -X POST {{ api_url }}/v1/scheduled_messages \
-u BOT_EMAIL_ADDRESS:BOT_API_KEY \
--data-urlencode type=direct \
--data-urlencode 'to=[9, 10]' \
--data-urlencode 'content=Can we meet on Monday?' \
--data-urlencode scheduled_delivery_timestamp=3165826990
```
{end_tabs}
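If your version of the Python bindings does not ship a dedicated helper for
this endpoint, the generic `call_endpoint` method can be used instead. This is
a sketch only, assuming a configured `zuliprc`; the stream ID, topic, and
timestamp mirror the curl example above.

```python
import zulip

client = zulip.Client(config_file="~/zuliprc")  # placeholder path

# Schedule a stream message via the generic endpoint helper.
result = client.call_endpoint(
    url="scheduled_messages",
    method="POST",
    request={
        "type": "stream",
        "to": 9,
        "topic": "Hello",
        "content": "Nice to meet everyone!",
        "scheduled_delivery_timestamp": 3165826990,
    },
)
print(result["result"])
```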
## Parameters
{generate_api_arguments_table|zulip.yaml|/scheduled_messages:post}
{generate_parameter_description(/scheduled_messages:post)}
## Response
{generate_return_values_table|zulip.yaml|/scheduled_messages:post}
{generate_response_description(/scheduled_messages:post)}
#### Example response(s)
{generate_code_example|/scheduled_messages:post|fixture}

View File

@@ -1,80 +0,0 @@
# HTTP headers
This page documents the HTTP headers used by the Zulip API.
Most important is that API clients authenticate to the server using
HTTP Basic authentication. If you're using the official [Python or
JavaScript bindings](/api/installation-instructions), this is taken
care of when you configure said bindings.
Otherwise, see the `curl` example on each endpoint's documentation
page, which details the request format.
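As a sketch of what the bindings do under the hood, any HTTP client that
supports Basic authentication works; the server URL and credentials below are
placeholders.

```python
import requests

# Placeholders: substitute your server, bot email, and API key.
ZULIP_SITE = "https://example.zulipchat.com"
BOT_EMAIL = "my-bot@example.com"
API_KEY = "abcd1234"

# HTTP Basic authentication: email as the username, API key as the password.
response = requests.get(
    f"{ZULIP_SITE}/api/v1/users/me",
    auth=(BOT_EMAIL, API_KEY),
)
print(response.status_code, response.json()["result"])
```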
Documented below are additional HTTP headers and header conventions
generally used by Zulip:
## The `User-Agent` header
Clients are not required to pass a `User-Agent` HTTP header, but we
highly recommend doing so when writing an integration. It's easy to do
and it can help save time when debugging issues related to an API
client.
If provided, the Zulip server will parse the `User-Agent` HTTP header
in order to identify specific clients and integrations. This
information is used by the server for logging, [usage
statistics](/help/analytics), and on rare occasions, for
backwards-compatibility logic to preserve support for older versions
of official clients.
Official Zulip clients and integrations use a `User-Agent` that starts
with something like `ZulipMobile/20.0.103 `, encoding the name of the
application and its version.
Zulip's official API bindings have reasonable defaults for
`User-Agent`. For example, the official Zulip Python bindings have a
default `User-Agent` starting with `ZulipPython/{version}`, where
`version` is the version of the library.
You can give your bot/integration its own name by passing the `client`
parameter when initializing the Python bindings. For example, the
official Zulip Nagios integration is initialized like this:
``` python
client = zulip.Client(
config_file=opts.config, client=f"ZulipNagios/{VERSION}"
)
```
If you are working on an integration that you plan to share outside
your organization, you can get help picking a good name in
`#integrations` in the [Zulip development
community](https://zulip.com/development-community).
## Rate-limiting response headers
To help clients avoid exceeding rate limits, Zulip sets the following
HTTP headers in all API responses:
* `X-RateLimit-Remaining`: The number of additional requests of this
type that the client can send before exceeding its limit.
* `X-RateLimit-Limit`: The limit that would be applicable to a client
that had not made any recent requests of this type. This is useful
for designing a client's burst behavior so as to avoid ever reaching
a rate limit.
* `X-RateLimit-Reset`: The time at which the client will no longer
have any rate limits applied to it (and thus could do a burst of
`X-RateLimit-Limit` requests).
[Zulip's rate limiting rules are configurable][rate-limiting-rules],
and can vary by server and over time. The default configuration
currently limits:
* Every user is limited to 200 total API requests per minute.
* Separate, much lower limits for authentication/login attempts.
When the Zulip server has configured multiple rate limits that apply
to a given request, the values returned will be for the strictest
limit.
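A client might use these headers to back off before exceeding a limit. The
sketch below assumes `X-RateLimit-Reset` can be interpreted as a UNIX
timestamp; the helper name and parameters are illustrative only.

```python
import time

import requests


def get_with_backoff(url: str, auth: tuple) -> requests.Response:
    # Sketch: pause until the reset time when the remaining budget is exhausted.
    response = requests.get(url, auth=auth)
    remaining = int(response.headers.get("X-RateLimit-Remaining", "1"))
    reset = float(response.headers.get("X-RateLimit-Reset", "0"))
    if remaining == 0:
        time.sleep(max(0.0, reset - time.time()))
    return response
```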
[rate-limiting-rules]: https://zulip.readthedocs.io/en/latest/production/security-model.html#rate-limiting

View File

@@ -1,26 +0,0 @@
# The Zulip API
Zulip's APIs allow you to integrate other services with Zulip. This
guide should help you find the API you need:
* First, check if the tool you'd like to integrate with Zulip
[already has a native integration](/integrations/).
* Next, check if [Zapier](https://zapier.com/apps) or
[IFTTT](https://ifttt.com/search) has an integration.
[Zulip's Zapier integration](/integrations/doc/zapier) and
[Zulip's IFTTT integration](/integrations/doc/ifttt) often allow
integrating a new service with Zulip without writing any code.
* If you'd like to send content into Zulip, you can
[write a native incoming webhook integration](/api/incoming-webhooks-overview)
or use [Zulip's API for sending messages](/api/send-message).
* If you're building an interactive bot that reacts to activity inside
Zulip, you'll want to look at Zulip's
[Python framework for interactive bots](/api/running-bots) or
[Zulip's real-time events API](/api/get-events).
And if you still need to build your own integration with Zulip, check out
the full [REST API](/api/rest), generally starting with
[installing the API client bindings](/api/installation-instructions).
In case you already know how you want to build your integration and you're
just looking for an API key, we've got you covered [here](/api/api-keys).

View File

@@ -1,34 +0,0 @@
# Error handling
Zulip's API will always return a JSON-formatted response.
The HTTP status code indicates whether the request was successful
(200 = success, 40x = user error, 50x = server error). Every response
will contain at least two keys: `msg` (a human-readable error message)
and `result`, which will be either `error` or `success` (this is
redundant with the HTTP status code, but is convenient when printing
responses while debugging).
For some common errors, Zulip provides a `code` attribute. Where
present, clients should check `code`, rather than `msg`, when looking
for specific error conditions, since the `msg` strings are
internationalized (e.g. the server will send the error message
translated into French if the user has a French locale).
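For example, here is a sketch of client-side error handling that branches on
`code` rather than the translated `msg`; the server URL, credentials, and
stream name are placeholders.

```python
import requests

response = requests.post(
    "https://example.zulipchat.com/api/v1/messages",  # placeholder server
    auth=("my-bot@example.com", "abcd1234"),          # placeholder credentials
    data={"type": "stream", "to": "nonexistent", "topic": "x", "content": "hi"},
)
payload = response.json()
if payload["result"] == "error":
    # Branch on the machine-readable code, not the localized msg.
    if payload.get("code") == "STREAM_DOES_NOT_EXIST":
        print("Stream is missing; create it or pick another.")
    else:
        print("Request failed:", payload.get("code"), payload["msg"])
```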
Each endpoint documents its own unique errors; documented below are
errors common to many endpoints:
{generate_code_example|/rest-error-handling:post|fixture}
## Ignored parameters
In JSON success responses, all Zulip REST API endpoints may return
an array of parameters sent in the request that are not supported
by that specific endpoint.
While this can be expected, e.g. when sending both current and legacy
names for a parameter to a Zulip server of unknown version, this often
indicates either a bug in the client implementation or an attempt to
configure a new feature while connected to an older Zulip server that
does not support said feature.
{generate_code_example|/settings:patch|fixture}

View File

@@ -1,120 +0,0 @@
# Roles and permissions
Zulip offers several levels of permissions based on a
[user's role](/help/roles-and-permissions) in a Zulip organization.
Here are some important details to note when working with these
roles and permissions in Zulip's API:
## A user's role
A user's account data include a `role` property, which contains the
user's role in the Zulip organization. These roles are encoded as:
* Organization owner: 100
* Organization administrator: 200
* Organization moderator: 300
* Member: 400
* Guest: 600
User account data also include these boolean properties that duplicate
the related roles above:
* `is_owner` specifying whether the user is an organization owner.
* `is_admin` specifying whether the user is an organization administrator.
* `is_guest` specifying whether the user is a guest user.
These are intended as conveniences for simple clients, and clients
should prefer using the `role` field, since only that one is updated
by the [events API](/api/get-events).
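As a sketch, a client might turn the numeric `role` into a readable name and
derive the convenience booleans from it, using the encoding listed above
(function and dictionary names here are illustrative):

```python
ROLE_NAMES = {
    100: "Organization owner",
    200: "Organization administrator",
    300: "Organization moderator",
    400: "Member",
    600: "Guest",
}


def describe_role(user: dict) -> str:
    # Prefer the role field; the boolean flags are derivable from it.
    role = user["role"]
    is_owner = role == 100
    is_admin = role <= 200
    is_guest = role == 600
    name = ROLE_NAMES.get(role, "Unknown")
    return f"{name} (owner={is_owner}, admin={is_admin}, guest={is_guest})"
```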
Note that [`POST /register`](/api/register-queue) also returns an
`is_moderator` boolean property specifying whether the current user is
an organization moderator.
Additionally, user account data include an `is_billing_admin` property
specifying whether the user is a billing administrator for the Zulip
organization, which is not related to one of the roles listed above,
but rather allows for specific permissions related to billing
administration in [paid Zulip Cloud plans](https://zulip.com/plans/).
### User account data in the API
Endpoints that return the user account data / properties mentioned
above are:
* [`GET /users`](/api/get-users)
* [`GET /users/{user_id}`](/api/get-user)
* [`GET /users/{email}`](/api/get-user-by-email)
* [`GET /users/me`](/api/get-own-user)
* [`GET /events`](/api/get-events)
* [`POST /register`](/api/register-queue)
Note that the [`POST /register` endpoint](/api/register-queue) returns
the above boolean properties to describe the role of the current user,
when `realm_user` is present in `fetch_event_types`.
Additionally, the specific events returned by the
[`GET /events` endpoint](/api/get-events) containing data related
to user accounts and roles are the [`realm_user` add
event](/api/get-events#realm_user-add), and the
[`realm_user` update event](/api/get-events#realm_user-update).
## Permission levels
Many areas of Zulip are customizable by the roles
above, such as (but not limited to) [restricting message editing and
deletion](/help/restrict-message-editing-and-deletion) and
[streams permissions](/help/stream-permissions). The potential
permission levels are:
* Everyone / Any user including Guests (least restrictive)
* Members
* Full members
* Moderators
* Administrators
* Owners
* Nobody (most restrictive)
These permission levels and policies in the API are designed as
cutoffs: users with the specified role or a more privileged one have
the specified ability or access. For example, a permission level
documented as 'moderators only' includes organization moderators,
administrators, and owners.
Note that specific settings and policies in the Zulip API that use these
permission levels will likely support a subset of those listed above.
## Determining if a user is a full member
When a Zulip organization has set up a [waiting period before new members
turn into full members](/help/restrict-permissions-of-new-members),
clients will need to determine if a user's account has aged past the
organization's waiting period threshold.
The `realm_waiting_period_threshold`, which is the number of days until
a user's account is treated as a full member, is returned by the
[`POST /register` endpoint](/api/register-queue) when `realm` is present
in `fetch_event_types`.
Clients can compare the `realm_waiting_period_threshold` to a user
account's `date_joined` property, which is the time the user account
was created, to determine whether a user has the permissions of a full
member or of a new member.
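A sketch of that comparison, assuming `date_joined` has already been parsed
into a timezone-aware `datetime` (the function name is illustrative):

```python
from datetime import datetime, timedelta, timezone


def is_past_waiting_period(date_joined: datetime, realm_waiting_period_threshold: int) -> bool:
    # True once the account has existed longer than the organization's waiting period.
    waiting_period = timedelta(days=realm_waiting_period_threshold)
    return datetime.now(timezone.utc) - date_joined >= waiting_period
```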

View File

@@ -1,27 +0,0 @@
## Integrations
* [Overview](/api/integrations-overview)
* [Incoming webhook integrations](/api/incoming-webhooks-overview)
* [Hello world walkthrough](/api/incoming-webhooks-walkthrough)
* [Non-webhook integrations](/api/non-webhook-integrations)
## Interactive bots (beta)
* [Running bots](/api/running-bots)
* [Deploying bots](/api/deploying-bots)
* [Writing bots](/api/writing-bots)
* [Outgoing webhooks](/api/outgoing-webhooks)
## REST API
* [Overview](/api/rest)
* [Installation instructions](/api/installation-instructions)
* [API keys](/api/api-keys)
* [Configuring the Python bindings](/api/configuring-python-bindings)
* [HTTP headers](/api/http-headers)
* [Error handling](/api/rest-error-handling)
* [Roles and permissions](/api/roles-and-permissions)
* [Client libraries](/api/client-libraries)
* [API changelog](/api/changelog)
{!rest-endpoints.md!}

View File

@@ -14,7 +14,7 @@ module.exports = {
[
"@babel/preset-env",
{
corejs: "3.30",
corejs: "3.20",
shippedProposals: true,
useBuiltIns: "usage",
},

View File

@@ -3,6 +3,7 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("contenttypes", "0001_initial"),
]

View File

@@ -3,6 +3,7 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("confirmation", "0001_initial"),
]

View File

@@ -3,6 +3,7 @@ from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
("confirmation", "0002_realmcreationkey"),
]

View File

@@ -3,6 +3,7 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("confirmation", "0003_emailchangeconfirmation"),
]

View File

@@ -4,6 +4,7 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("zerver", "0124_stream_enable_notifications"),
("confirmation", "0004_remove_confirmationmanager"),

View File

@@ -4,6 +4,7 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("confirmation", "0005_confirmation_realm"),
]

View File

@@ -4,6 +4,7 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("confirmation", "0006_realmcreationkey_presume_email_valid"),
]

View File

@@ -2,6 +2,7 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("confirmation", "0007_add_indexes"),
]

View File

@@ -5,12 +5,12 @@ from datetime import timedelta
from django.conf import settings
from django.db import migrations, transaction
from django.db.backends.base.schema import BaseDatabaseSchemaEditor
from django.db.backends.postgresql.schema import DatabaseSchemaEditor
from django.db.migrations.state import StateApps
def set_expiry_date_for_existing_confirmations(
apps: StateApps, schema_editor: BaseDatabaseSchemaEditor
apps: StateApps, schema_editor: DatabaseSchemaEditor
) -> None:
Confirmation = apps.get_model("confirmation", "Confirmation")
if not Confirmation.objects.exists():

View File

@@ -4,6 +4,7 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("confirmation", "0009_confirmation_expiry_date_backfill"),
]

View File

@@ -4,6 +4,7 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("confirmation", "0010_alter_confirmation_expiry_date"),
]

View File

@@ -13,24 +13,24 @@ from django.contrib.contenttypes.models import ContentType
from django.db import models
from django.db.models import CASCADE
from django.http import HttpRequest, HttpResponse
from django.template.response import TemplateResponse
from django.shortcuts import render
from django.urls import reverse
from django.utils.timezone import now as timezone_now
from typing_extensions import Protocol
from confirmation import settings as confirmation_settings
from zerver.lib.types import UnspecifiedValue
from zerver.models import (
EmailChangeStatus,
MultiuseInvite,
PreregistrationRealm,
PreregistrationUser,
Realm,
RealmReactivationStatus,
UserProfile,
)
from zerver.models import EmailChangeStatus, MultiuseInvite, PreregistrationUser, Realm, UserProfile
class ConfirmationKeyError(Exception):
class HasRealmObject(Protocol):
realm: Realm
class OptionalHasRealmObject(Protocol):
realm: Optional[Realm]
class ConfirmationKeyException(Exception):
WRONG_LENGTH = 1
EXPIRED = 2
DOES_NOT_EXIST = 3
@@ -41,13 +41,13 @@ class ConfirmationKeyError(Exception):
def render_confirmation_key_error(
request: HttpRequest, exception: ConfirmationKeyError
request: HttpRequest, exception: ConfirmationKeyException
) -> HttpResponse:
if exception.error_type == ConfirmationKeyError.WRONG_LENGTH:
return TemplateResponse(request, "confirmation/link_malformed.html", status=404)
if exception.error_type == ConfirmationKeyError.EXPIRED:
return TemplateResponse(request, "confirmation/link_expired.html", status=404)
return TemplateResponse(request, "confirmation/link_does_not_exist.html", status=404)
if exception.error_type == ConfirmationKeyException.WRONG_LENGTH:
return render(request, "confirmation/link_malformed.html", status=404)
if exception.error_type == ConfirmationKeyException.EXPIRED:
return render(request, "confirmation/link_expired.html", status=404)
return render(request, "confirmation/link_does_not_exist.html", status=404)
def generate_key() -> str:
@@ -55,87 +55,58 @@ def generate_key() -> str:
return b32encode(secrets.token_bytes(15)).decode().lower()
ConfirmationObjT = Union[
MultiuseInvite,
PreregistrationRealm,
PreregistrationUser,
EmailChangeStatus,
UserProfile,
RealmReactivationStatus,
]
ConfirmationObjT = Union[MultiuseInvite, PreregistrationUser, EmailChangeStatus]
def get_object_from_key(
confirmation_key: str, confirmation_types: List[int], *, mark_object_used: bool
confirmation_key: str, confirmation_types: List[int], activate_object: bool = True
) -> ConfirmationObjT:
"""Access a confirmation object from one of the provided confirmation
types with the provided key.
The mark_object_used parameter determines whether to mark the
confirmation object as used (which generally prevents it from
being used again). It should always be False for MultiuseInvite
objects, since they are intended to be used multiple times.
"""
# Confirmation keys used to be 40 characters
if len(confirmation_key) not in (24, 40):
raise ConfirmationKeyError(ConfirmationKeyError.WRONG_LENGTH)
raise ConfirmationKeyException(ConfirmationKeyException.WRONG_LENGTH)
try:
confirmation = Confirmation.objects.get(
confirmation_key=confirmation_key, type__in=confirmation_types
)
except Confirmation.DoesNotExist:
raise ConfirmationKeyError(ConfirmationKeyError.DOES_NOT_EXIST)
raise ConfirmationKeyException(ConfirmationKeyException.DOES_NOT_EXIST)
if confirmation.expiry_date is not None and timezone_now() > confirmation.expiry_date:
raise ConfirmationKeyError(ConfirmationKeyError.EXPIRED)
raise ConfirmationKeyException(ConfirmationKeyException.EXPIRED)
obj = confirmation.content_object
assert obj is not None
used_value = confirmation_settings.STATUS_USED
revoked_value = confirmation_settings.STATUS_REVOKED
if hasattr(obj, "status") and obj.status in [used_value, revoked_value]:
# Confirmations where the object has the status attribute are one-time use
# and are marked after being used (or revoked).
raise ConfirmationKeyError(ConfirmationKeyError.EXPIRED)
if mark_object_used:
# MultiuseInvite objects do not use the STATUS_USED status, since they are
# intended to be used more than once.
assert confirmation.type != Confirmation.MULTIUSE_INVITE
assert hasattr(obj, "status")
obj.status = getattr(settings, "STATUS_USED", 1)
if activate_object and hasattr(obj, "status"):
obj.status = getattr(settings, "STATUS_ACTIVE", 1)
obj.save(update_fields=["status"])
return obj
def create_confirmation_link(
obj: ConfirmationObjT,
obj: Union[Realm, HasRealmObject, OptionalHasRealmObject],
confirmation_type: int,
*,
validity_in_minutes: Union[Optional[int], UnspecifiedValue] = UnspecifiedValue(),
validity_in_days: Union[Optional[int], UnspecifiedValue] = UnspecifiedValue(),
url_args: Mapping[str, str] = {},
realm_creation: bool = False,
) -> str:
# validity_in_minutes is an override for the default values which are
# validity_in_days is an override for the default values which are
# determined by the confirmation_type - its main purpose is for use
# in tests which may want to have control over the exact expiration time.
key = generate_key()
if realm_creation:
realm = None
else:
assert not isinstance(obj, PreregistrationRealm)
realm = None
if isinstance(obj, Realm):
realm = obj
elif hasattr(obj, "realm"):
realm = obj.realm
current_time = timezone_now()
expiry_date = None
if not isinstance(validity_in_minutes, UnspecifiedValue):
if validity_in_minutes is None:
if not isinstance(validity_in_days, UnspecifiedValue):
if validity_in_days is None:
expiry_date = None
else:
assert validity_in_minutes is not None
expiry_date = current_time + datetime.timedelta(minutes=validity_in_minutes)
assert validity_in_days is not None
expiry_date = current_time + datetime.timedelta(days=validity_in_days)
else:
expiry_date = current_time + datetime.timedelta(
days=_properties[confirmation_type].validity_in_days
@@ -168,12 +139,12 @@ def confirmation_url(
class Confirmation(models.Model):
content_type = models.ForeignKey(ContentType, on_delete=CASCADE)
object_id = models.PositiveIntegerField(db_index=True)
object_id: int = models.PositiveIntegerField(db_index=True)
content_object = GenericForeignKey("content_type", "object_id")
date_sent = models.DateTimeField(db_index=True)
confirmation_key = models.CharField(max_length=40, db_index=True)
expiry_date = models.DateTimeField(db_index=True, null=True)
realm = models.ForeignKey(Realm, null=True, on_delete=CASCADE)
date_sent: datetime.datetime = models.DateTimeField(db_index=True)
confirmation_key: str = models.CharField(max_length=40, db_index=True)
expiry_date: Optional[datetime.datetime] = models.DateTimeField(db_index=True, null=True)
realm: Optional[Realm] = models.ForeignKey(Realm, null=True, on_delete=CASCADE)
# The following list is the set of valid types
USER_REGISTRATION = 1
@@ -184,14 +155,14 @@ class Confirmation(models.Model):
MULTIUSE_INVITE = 6
REALM_CREATION = 7
REALM_REACTIVATION = 8
type = models.PositiveSmallIntegerField()
type: int = models.PositiveSmallIntegerField()
def __str__(self) -> str:
return f"<Confirmation: {self.content_object}>"
class Meta:
unique_together = ("type", "confirmation_key")
def __str__(self) -> str:
return f"{self.content_object!r}"
class ConfirmationType:
def __init__(
@@ -247,10 +218,10 @@ def validate_key(creation_key: Optional[str]) -> Optional["RealmCreationKey"]:
try:
key_record = RealmCreationKey.objects.get(creation_key=creation_key)
except RealmCreationKey.DoesNotExist:
raise RealmCreationKey.InvalidError
raise RealmCreationKey.Invalid()
time_elapsed = timezone_now() - key_record.date_created
if time_elapsed.total_seconds() > settings.REALM_CREATION_LINK_VALIDITY_DAYS * 24 * 3600:
raise RealmCreationKey.InvalidError
raise RealmCreationKey.Invalid()
return key_record
@@ -271,7 +242,7 @@ class RealmCreationKey(models.Model):
# True just if we should presume the email address the user enters
# is theirs, and skip sending mail to it to confirm that.
presume_email_valid = models.BooleanField(default=False)
presume_email_valid: bool = models.BooleanField(default=False)
class InvalidError(Exception):
class Invalid(Exception):
pass

View File

@@ -2,5 +2,5 @@
__revision__ = "$Id: settings.py 12 2008-11-23 19:38:52Z jarek.zgoda $"
STATUS_USED = 1
STATUS_ACTIVE = 1
STATUS_REVOKED = 2

View File

@@ -3,11 +3,11 @@ from typing import Optional
from django.conf import settings
from django.utils.translation import gettext as _
from corporate.lib.stripe import LicenseLimitError, get_latest_seat_count, get_seat_count
from corporate.lib.stripe import LicenseLimitError, get_latest_seat_count
from corporate.models import get_current_plan_by_realm
from zerver.actions.create_user import send_message_to_signup_notification_stream
from zerver.lib.exceptions import InvitationError
from zerver.models import Realm, UserProfile, get_system_bot
from zerver.models import Realm, get_system_bot
def generate_licenses_low_warning_message_if_required(realm: Realm) -> Optional[str]:
@@ -69,41 +69,33 @@ def send_user_unable_to_signup_message_to_signup_notification_stream(
def check_spare_licenses_available_for_adding_new_users(
realm: Realm, extra_non_guests_count: int = 0, extra_guests_count: int = 0
realm: Realm, number_of_users_to_add: int
) -> None:
plan = get_current_plan_by_realm(realm)
if plan is None or plan.automanage_licenses or plan.customer.exempt_from_license_number_check:
if (
plan is None
or plan.automanage_licenses
or plan.customer.exempt_from_from_license_number_check
):
return
if plan.licenses() < get_seat_count(
realm, extra_non_guests_count=extra_non_guests_count, extra_guests_count=extra_guests_count
):
raise LicenseLimitError
if plan.licenses() < get_latest_seat_count(realm) + number_of_users_to_add:
raise LicenseLimitError()
def check_spare_licenses_available_for_registering_new_user(
realm: Realm,
user_email_to_add: str,
role: int,
realm: Realm, user_email_to_add: str
) -> None:
try:
if role == UserProfile.ROLE_GUEST:
check_spare_licenses_available_for_adding_new_users(realm, extra_guests_count=1)
else:
check_spare_licenses_available_for_adding_new_users(realm, extra_non_guests_count=1)
check_spare_licenses_available_for_adding_new_users(realm, 1)
except LicenseLimitError:
send_user_unable_to_signup_message_to_signup_notification_stream(realm, user_email_to_add)
raise
def check_spare_licenses_available_for_inviting_new_users(
realm: Realm, extra_non_guests_count: int = 0, extra_guests_count: int = 0
) -> None:
num_invites = extra_non_guests_count + extra_guests_count
def check_spare_licenses_available_for_inviting_new_users(realm: Realm, num_invites: int) -> None:
try:
check_spare_licenses_available_for_adding_new_users(
realm, extra_non_guests_count, extra_guests_count
)
check_spare_licenses_available_for_adding_new_users(realm, num_invites)
except LicenseLimitError:
if num_invites == 1:
message = _("All Zulip licenses for this organization are currently in use.")

View File

@@ -5,7 +5,7 @@ import secrets
from datetime import datetime, timedelta
from decimal import Decimal
from functools import wraps
from typing import Any, Callable, Dict, Generator, Optional, Tuple, TypeVar, Union
from typing import Any, Callable, Dict, Generator, Optional, Tuple, TypeVar, Union, cast
import orjson
import stripe
@@ -17,7 +17,6 @@ from django.utils.timezone import now as timezone_now
from django.utils.translation import gettext as _
from django.utils.translation import gettext_lazy
from django.utils.translation import override as override_language
from typing_extensions import ParamSpec
from corporate.models import (
Customer,
@@ -46,8 +45,7 @@ billing_logger = logging.getLogger("corporate.stripe")
log_to_file(billing_logger, BILLING_LOG_PATH)
log_to_file(logging.getLogger("stripe"), BILLING_LOG_PATH)
ParamT = ParamSpec("ParamT")
ReturnT = TypeVar("ReturnT")
CallableT = TypeVar("CallableT", bound=Callable[..., object])
MIN_INVOICED_LICENSES = 30
MAX_INVOICED_LICENSES = 1000
@@ -58,29 +56,14 @@ STRIPE_API_VERSION = "2020-08-27"
def get_latest_seat_count(realm: Realm) -> int:
return get_seat_count(realm, extra_non_guests_count=0, extra_guests_count=0)
def get_seat_count(
realm: Realm, extra_non_guests_count: int = 0, extra_guests_count: int = 0
) -> int:
non_guests = (
UserProfile.objects.filter(realm=realm, is_active=True, is_bot=False)
.exclude(role=UserProfile.ROLE_GUEST)
.count()
) + extra_non_guests_count
# This guest count calculation should match the similar query in render_stats().
guests = (
UserProfile.objects.filter(
realm=realm, is_active=True, is_bot=False, role=UserProfile.ROLE_GUEST
).count()
+ extra_guests_count
)
# This formula achieves the pricing of the first 5*N guests
# being free of charge (where N is the number of non-guests in the organization)
# and each consecutive one being worth 1/5 the non-guest price.
guests = UserProfile.objects.filter(
realm=realm, is_active=True, is_bot=False, role=UserProfile.ROLE_GUEST
).count()
return max(non_guests, math.ceil(guests / 5))
@@ -95,19 +78,14 @@ def unsign_string(signed_string: str, salt: str) -> str:
return signer.unsign(signed_string)
def validate_licenses(
charge_automatically: bool,
licenses: Optional[int],
seat_count: int,
exempt_from_license_number_check: bool,
) -> None:
def validate_licenses(charge_automatically: bool, licenses: Optional[int], seat_count: int) -> None:
min_licenses = seat_count
max_licenses = None
if not charge_automatically:
min_licenses = max(seat_count, MIN_INVOICED_LICENSES)
max_licenses = MAX_INVOICED_LICENSES
if licenses is None or (not exempt_from_license_number_check and licenses < min_licenses):
if licenses is None or licenses < min_licenses:
raise BillingError(
"not enough licenses", _("You must invoice for at least {} users.").format(min_licenses)
)
@@ -253,21 +231,21 @@ class UpgradeWithExistingPlanError(BillingError):
)
class InvalidBillingScheduleError(Exception):
class InvalidBillingSchedule(Exception):
def __init__(self, billing_schedule: int) -> None:
self.message = f"Unknown billing_schedule: {billing_schedule}"
super().__init__(self.message)
class InvalidTierError(Exception):
class InvalidTier(Exception):
def __init__(self, tier: int) -> None:
self.message = f"Unknown tier: {tier}"
super().__init__(self.message)
def catch_stripe_errors(func: Callable[ParamT, ReturnT]) -> Callable[ParamT, ReturnT]:
def catch_stripe_errors(func: CallableT) -> CallableT:
@wraps(func)
def wrapped(*args: ParamT.args, **kwargs: ParamT.kwargs) -> ReturnT:
def wrapped(*args: object, **kwargs: object) -> object:
try:
return func(*args, **kwargs)
# See https://stripe.com/docs/api/python#error_handling, though
@@ -301,7 +279,7 @@ def catch_stripe_errors(func: Callable[ParamT, ReturnT]) -> Callable[ParamT, Ret
)
raise BillingError("other stripe error")
return wrapped
return cast(CallableT, wrapped)
@catch_stripe_errors
@@ -524,9 +502,7 @@ def make_end_of_cycle_updates_if_needed(
standard_plan_last_ledger = (
LicenseLedger.objects.filter(plan=standard_plan).order_by("id").last()
)
assert standard_plan_last_ledger is not None
licenses_for_plus_plan = standard_plan_last_ledger.licenses_at_next_renewal
assert licenses_for_plus_plan is not None
plus_plan_ledger_entry = LicenseLedger.objects.create(
plan=plus_plan,
is_renewal=True,
@@ -575,16 +551,16 @@ def get_price_per_license(
elif billing_schedule == CustomerPlan.MONTHLY:
price_per_license = 800
else: # nocoverage
raise InvalidBillingScheduleError(billing_schedule)
raise InvalidBillingSchedule(billing_schedule)
elif tier == CustomerPlan.PLUS:
if billing_schedule == CustomerPlan.ANNUAL:
price_per_license = 16000
elif billing_schedule == CustomerPlan.MONTHLY:
price_per_license = 1600
else: # nocoverage
raise InvalidBillingScheduleError(billing_schedule)
raise InvalidBillingSchedule(billing_schedule)
else:
raise InvalidTierError(tier)
raise InvalidTier(tier)
if discount is not None:
price_per_license = calculate_discounted_price_per_license(price_per_license, discount)
@@ -607,7 +583,7 @@ def compute_plan_parameters(
elif billing_schedule == CustomerPlan.MONTHLY:
period_end = add_months(billing_cycle_anchor, 1)
else: # nocoverage
raise InvalidBillingScheduleError(billing_schedule)
raise InvalidBillingSchedule(billing_schedule)
price_per_license = get_price_per_license(tier, billing_schedule, discount)
@@ -632,7 +608,7 @@ def is_free_trial_offer_enabled() -> bool:
return settings.FREE_TRIAL_DAYS not in (None, 0)
def ensure_realm_does_not_have_active_plan(realm: Realm) -> None:
def ensure_realm_does_not_have_active_plan(realm: Customer) -> None:
if get_current_plan_by_realm(realm) is not None:
# Unlikely race condition from two people upgrading (clicking "Make payment")
# at exactly the same time. Doesn't fully resolve the race condition, but having
@@ -641,7 +617,7 @@ def ensure_realm_does_not_have_active_plan(realm: Realm) -> None:
"Upgrade of %s failed because of existing active plan.",
realm.string_id,
)
raise UpgradeWithExistingPlanError
raise UpgradeWithExistingPlanError()
@transaction.atomic
@@ -653,7 +629,7 @@ def do_change_remote_server_plan_type(remote_server: RemoteZulipServer, plan_typ
event_type=RealmAuditLog.REMOTE_SERVER_PLAN_TYPE_CHANGED,
server=remote_server,
event_time=timezone_now(),
extra_data=str({"old_value": old_value, "new_value": plan_type}),
extra_data={"old_value": old_value, "new_value": plan_type},
)
@@ -661,8 +637,8 @@ def do_change_remote_server_plan_type(remote_server: RemoteZulipServer, plan_typ
def do_deactivate_remote_server(remote_server: RemoteZulipServer) -> None:
if remote_server.deactivated:
billing_logger.warning(
"Cannot deactivate remote server with ID %d, server has already been deactivated.",
remote_server.id,
f"Cannot deactivate remote server with ID {remote_server.id}, "
"server has already been deactivated."
)
return
@@ -940,9 +916,7 @@ def invoice_plan(plan: CustomerPlan, event_time: datetime) -> None:
plan.save(update_fields=["next_invoice_date"])
def invoice_plans_as_needed(event_time: Optional[datetime] = None) -> None:
if event_time is None: # nocoverage
event_time = timezone_now()
def invoice_plans_as_needed(event_time: datetime = timezone_now()) -> None:
for plan in CustomerPlan.objects.filter(next_invoice_date__lte=event_time):
invoice_plan(plan, event_time)
@@ -973,7 +947,7 @@ def attach_discount_to_realm(
acting_user=acting_user,
event_type=RealmAuditLog.REALM_DISCOUNT_CHANGED,
event_time=timezone_now(),
extra_data=str({"old_discount": old_discount, "new_discount": discount}),
extra_data={"old_discount": old_discount, "new_discount": discount},
)
@@ -988,7 +962,9 @@ def update_sponsorship_status(
acting_user=acting_user,
event_type=RealmAuditLog.REALM_SPONSORSHIP_PENDING_STATUS_CHANGED,
event_time=timezone_now(),
extra_data=str({"sponsorship_pending": sponsorship_pending}),
extra_data={
"sponsorship_pending": sponsorship_pending,
},
)
@@ -1011,16 +987,11 @@ def approve_sponsorship(realm: Realm, *, acting_user: Optional[UserProfile]) ->
for user in realm.get_human_billing_admin_and_realm_owner_users():
with override_language(user.default_language):
# Using variable to make life easier for translators if these details change.
plan_name = "Zulip Cloud Standard"
emoji = ":tada:"
message = _(
"Your organization's request for sponsored hosting has been approved! "
"You have been upgraded to {plan_name}, free of charge. {emoji}\n\n"
"If you could {begin_link}list Zulip as a sponsor on your website{end_link}, "
"we would really appreciate it!"
).format(
plan_name="Zulip Cloud Standard",
emoji=":tada:",
begin_link="[",
end_link="](/help/linking-to-zulip-website)",
f"Your organization's request for sponsored hosting has been approved! {emoji}.\n"
f"You have been upgraded to {plan_name}, free of charge."
)
internal_send_private_message(notification_bot, user, message)
@@ -1067,7 +1038,6 @@ def estimate_annual_recurring_revenue_by_realm() -> Dict[str, int]: # nocoverag
if plan.billing_schedule == CustomerPlan.MONTHLY:
renewal_cents *= 12
# TODO: Decimal stuff
assert plan.customer.realm is not None
annual_revenue[plan.customer.realm.string_id] = int(renewal_cents / 100)
return annual_revenue
@@ -1076,7 +1046,6 @@ def get_realms_to_default_discount_dict() -> Dict[str, Decimal]:
realms_to_default_discount: Dict[str, Any] = {}
customers = Customer.objects.exclude(default_discount=None).exclude(default_discount=0)
for customer in customers:
assert customer.realm is not None
realms_to_default_discount[customer.realm.string_id] = assert_is_not_none(
customer.default_discount
)
@@ -1145,7 +1114,6 @@ def downgrade_small_realms_behind_on_payments_as_needed() -> None:
customers = Customer.objects.all().exclude(stripe_customer_id=None)
for customer in customers:
realm = customer.realm
assert realm is not None
# For larger realms, we generally want to talk to the customer
# before downgrading or cancelling invoices; so this logic only applies with 5.
@@ -1197,17 +1165,11 @@ def switch_realm_from_standard_to_plus_plan(realm: Realm) -> None:
standard_plan.next_invoice_date = plan_switch_time
standard_plan.save(update_fields=["status", "next_invoice_date"])
from zerver.actions.realm_settings import do_change_realm_plan_type
do_change_realm_plan_type(realm, Realm.PLAN_TYPE_PLUS, acting_user=None)
standard_plan_next_renewal_date = start_of_next_billing_cycle(standard_plan, plan_switch_time)
standard_plan_last_renewal_ledger = (
LicenseLedger.objects.filter(is_renewal=True, plan=standard_plan).order_by("id").last()
)
assert standard_plan_last_renewal_ledger is not None
assert standard_plan.price_per_license is not None
standard_plan_last_renewal_amount = (
standard_plan_last_renewal_ledger.licenses * standard_plan.price_per_license
)
@@ -1242,5 +1204,7 @@ def update_billing_method_of_current_plan(
acting_user=acting_user,
event_type=RealmAuditLog.REALM_BILLING_METHOD_CHANGED,
event_time=timezone_now(),
extra_data=str({"charge_automatically": charge_automatically}),
extra_data={
"charge_automatically": charge_automatically,
},
)

View File

@@ -1,5 +1,4 @@
import logging
from contextlib import suppress
from typing import Any, Callable, Dict, Union
import stripe
@@ -83,16 +82,17 @@ def handle_checkout_session_completed_event(
]:
ensure_realm_does_not_have_active_plan(user.realm)
update_or_create_stripe_customer(user, payment_method)
assert session.payment_intent is not None
session.payment_intent.status = PaymentIntent.PROCESSING
session.payment_intent.last_payment_error = ()
session.payment_intent.save(update_fields=["status", "last_payment_error"])
with suppress(stripe.error.CardError):
try:
stripe.PaymentIntent.confirm(
session.payment_intent.stripe_payment_intent_id,
payment_method=payment_method,
off_session=True,
)
except stripe.error.CardError:
pass
elif session.type in [
Session.FREE_TRIAL_UPGRADE_FROM_BILLING_PAGE,
Session.FREE_TRIAL_UPGRADE_FROM_ONBOARDING_PAGE,
@@ -160,12 +160,11 @@ def handle_payment_intent_succeeded_event(
@error_handler
def handle_payment_intent_payment_failed_event(
stripe_payment_intent: stripe.PaymentIntent, payment_intent: PaymentIntent
stripe_payment_intent: stripe.PaymentIntent, payment_intent: Event
) -> None:
payment_intent.status = PaymentIntent.get_status_integer_from_status_text(
stripe_payment_intent.status
)
assert payment_intent.customer.realm is not None
billing_logger.info(
"Stripe payment intent failed: %s %s %s %s",
payment_intent.customer.realm.string_id,

View File

@@ -5,6 +5,7 @@ from django.db import migrations, models
class Migration(migrations.Migration):
initial = True
dependencies = [

View File

@@ -4,6 +4,7 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("corporate", "0001_initial"),
]

View File

@@ -5,6 +5,7 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("corporate", "0002_customer_default_discount"),
]

View File

@@ -5,6 +5,7 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("corporate", "0003_customerplan"),
]

View File

@@ -5,6 +5,7 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("corporate", "0004_licenseledger"),
]

View File

@@ -4,6 +4,7 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("corporate", "0005_customerplan_invoicing"),
]

View File

@@ -4,6 +4,7 @@ from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
("corporate", "0006_nullable_stripe_customer_id"),
]

View File

@@ -4,6 +4,7 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("corporate", "0007_remove_deprecated_fields"),
]

View File

@@ -4,6 +4,7 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("corporate", "0008_nullable_next_invoice_date"),
]

View File

@@ -4,6 +4,7 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("corporate", "0009_customer_sponsorship_pending"),
]

View File

@@ -6,6 +6,7 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
migrations.swappable_dependency(settings.AUTH_USER_MODEL),
("zerver", "0333_alter_realm_org_type"),

View File

@@ -4,6 +4,7 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("corporate", "0012_zulipsponsorshiprequest"),
]

View File

@@ -4,6 +4,7 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("corporate", "0013_alter_zulipsponsorshiprequest_org_website"),
]

View File

@@ -5,6 +5,7 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("contenttypes", "0002_remove_content_type_name"),
("corporate", "0014_customerplan_end_date"),

View File

@@ -5,6 +5,7 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("zilencer", "0018_remoterealmauditlog"),
("zerver", "0370_realm_enable_spectator_access"),

View File

@@ -1,17 +0,0 @@
# Generated by Django 4.2 on 2023-04-10 18:22
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
("corporate", "0016_customer_add_remote_server_field"),
]
operations = [
migrations.RenameField(
model_name="customer",
old_name="exempt_from_from_license_number_check",
new_name="exempt_from_license_number_check",
),
]

View File

@@ -1,3 +1,5 @@
import datetime
from decimal import Decimal
from typing import Any, Dict, Optional, Union
from django.contrib.contenttypes.fields import GenericForeignKey
@@ -16,20 +18,21 @@ class Customer(models.Model):
and the active plan, if any.
"""
realm = models.OneToOneField(Realm, on_delete=CASCADE, null=True)
remote_server = models.OneToOneField(RemoteZulipServer, on_delete=CASCADE, null=True)
stripe_customer_id = models.CharField(max_length=255, null=True, unique=True)
sponsorship_pending = models.BooleanField(default=False)
realm: Optional[Realm] = models.OneToOneField(Realm, on_delete=CASCADE, null=True)
remote_server: Optional[RemoteZulipServer] = models.OneToOneField(
RemoteZulipServer, on_delete=CASCADE, null=True
)
stripe_customer_id: Optional[str] = models.CharField(max_length=255, null=True, unique=True)
sponsorship_pending: bool = models.BooleanField(default=False)
# A percentage, like 85.
default_discount = models.DecimalField(decimal_places=4, max_digits=7, null=True)
default_discount: Optional[Decimal] = models.DecimalField(
decimal_places=4, max_digits=7, null=True
)
# Some non-profit organizations on manual license management pay
# only for their paid employees. We don't prevent these
# organizations from adding more users than the number of licenses
# they purchased.
exempt_from_license_number_check = models.BooleanField(default=False)
def __str__(self) -> str:
return f"{self.realm!r} {self.stripe_customer_id}"
exempt_from_from_license_number_check: bool = models.BooleanField(default=False)
@property
def is_self_hosted(self) -> bool:
@@ -45,6 +48,9 @@ class Customer(models.Model):
assert self.remote_server is None
return is_cloud
def __str__(self) -> str:
return f"<Customer {self.realm} {self.stripe_customer_id}>"
def get_customer_by_realm(realm: Realm) -> Optional[Customer]:
return Customer.objects.filter(realm=realm).first()
@@ -90,8 +96,8 @@ def get_last_associated_event_by_type(
class Session(models.Model):
customer = models.ForeignKey(Customer, on_delete=CASCADE)
stripe_session_id = models.CharField(max_length=255, unique=True)
customer: Customer = models.ForeignKey(Customer, on_delete=CASCADE)
stripe_session_id: str = models.CharField(max_length=255, unique=True)
payment_intent = models.ForeignKey("PaymentIntent", null=True, on_delete=CASCADE)
UPGRADE_FROM_BILLING_PAGE = 1
@@ -99,11 +105,11 @@ class Session(models.Model):
FREE_TRIAL_UPGRADE_FROM_BILLING_PAGE = 20
FREE_TRIAL_UPGRADE_FROM_ONBOARDING_PAGE = 30
CARD_UPDATE_FROM_BILLING_PAGE = 40
type = models.SmallIntegerField()
type: int = models.SmallIntegerField()
CREATED = 1
COMPLETED = 10
status = models.SmallIntegerField(default=CREATED)
status: int = models.SmallIntegerField(default=CREATED)
def get_status_as_string(self) -> str:
return {Session.CREATED: "created", Session.COMPLETED: "completed"}[self.status]
@@ -136,8 +142,8 @@ class Session(models.Model):
class PaymentIntent(models.Model):
customer = models.ForeignKey(Customer, on_delete=CASCADE)
stripe_payment_intent_id = models.CharField(max_length=255, unique=True)
customer: Customer = models.ForeignKey(Customer, on_delete=CASCADE)
stripe_payment_intent_id: str = models.CharField(max_length=255, unique=True)
REQUIRES_PAYMENT_METHOD = 1
REQUIRES_CONFIRMATION = 20
@@ -147,7 +153,7 @@ class PaymentIntent(models.Model):
CANCELLED = 60
SUCCEEDED = 70
status = models.SmallIntegerField()
status: int = models.SmallIntegerField()
last_payment_error = models.JSONField(default=None, null=True)
@classmethod
@@ -194,47 +200,47 @@ class CustomerPlan(models.Model):
# A customer can only have one ACTIVE plan, but old, inactive plans
# are preserved to allow auditing - so there can be multiple
# CustomerPlan objects pointing to one Customer.
customer = models.ForeignKey(Customer, on_delete=CASCADE)
customer: Customer = models.ForeignKey(Customer, on_delete=CASCADE)
automanage_licenses = models.BooleanField(default=False)
charge_automatically = models.BooleanField(default=False)
automanage_licenses: bool = models.BooleanField(default=False)
charge_automatically: bool = models.BooleanField(default=False)
# Both of these are in cents. Exactly one of price_per_license or
# fixed_price should be set. fixed_price is only for manual deals, and
# can't be set via the self-serve billing system.
price_per_license = models.IntegerField(null=True)
fixed_price = models.IntegerField(null=True)
price_per_license: Optional[int] = models.IntegerField(null=True)
fixed_price: Optional[int] = models.IntegerField(null=True)
# Discount that was applied. For display purposes only.
discount = models.DecimalField(decimal_places=4, max_digits=6, null=True)
discount: Optional[Decimal] = models.DecimalField(decimal_places=4, max_digits=6, null=True)
# Initialized with the time of plan creation. Used for calculating
# start of next billing cycle, next invoice date etc. This value
# should never be modified. The only exception is when we change
# the status of the plan from free trial to active and reset the
# billing_cycle_anchor.
billing_cycle_anchor = models.DateTimeField()
billing_cycle_anchor: datetime.datetime = models.DateTimeField()
ANNUAL = 1
MONTHLY = 2
- billing_schedule = models.SmallIntegerField()
+ billing_schedule: int = models.SmallIntegerField()
# The next date the billing system should go through ledger
# entries and create invoices for additional users or plan
# renewal. Since we use a daily cron job for invoicing, the
# invoice will be generated the first time the cron job runs after
# next_invoice_date.
- next_invoice_date = models.DateTimeField(db_index=True, null=True)
+ next_invoice_date: Optional[datetime.datetime] = models.DateTimeField(db_index=True, null=True)
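With next_invoice_date indexed and a daily invoicing cron, the pass over due plans plausibly starts from a query of this shape (a sketch; the actual management command is not part of this diff):

from django.db.models import QuerySet
from django.utils import timezone

from corporate.models import CustomerPlan


def plans_due_for_invoicing() -> "QuerySet[CustomerPlan]":
    # Plans whose next_invoice_date has passed; the db_index above keeps
    # this lookup cheap as ended plans accumulate.
    return CustomerPlan.objects.filter(next_invoice_date__lte=timezone.now())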
# On next_invoice_date, we go through ledger entries that were
# created after invoiced_through and process them by generating
# invoices for any additional users and/or plan renewal. Once the
# invoice is generated, we update the value of invoiced_through
# and set it to the last ledger entry we processed.
- invoiced_through = models.ForeignKey(
+ invoiced_through: Optional["LicenseLedger"] = models.ForeignKey(
"LicenseLedger", null=True, on_delete=CASCADE, related_name="+"
)
- end_date = models.DateTimeField(null=True)
+ end_date: Optional[datetime.datetime] = models.DateTimeField(null=True)
DONE = 1
STARTED = 2
@@ -242,12 +248,12 @@ class CustomerPlan(models.Model):
# This status field helps ensure any errors encountered during the
# invoicing process do not leave our invoicing system in a broken
# state.
- invoicing_status = models.SmallIntegerField(default=DONE)
+ invoicing_status: int = models.SmallIntegerField(default=DONE)
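The two-value invoicing_status field is a "mark before you work" guard; a hedged outline of how an invoicing run could use it, so that a crash surfaces as a plan stuck in STARTED rather than as a double invoice:

from corporate.models import CustomerPlan


def invoice_plan_sketch(plan: CustomerPlan) -> None:
    # Hypothetical outline only; the real worker is not shown in this diff.
    if plan.invoicing_status == CustomerPlan.STARTED:
        raise RuntimeError(f"Plan {plan.id}: previous invoicing run did not finish.")
    plan.invoicing_status = CustomerPlan.STARTED
    plan.save(update_fields=["invoicing_status"])

    ...  # generate invoices from ledger entries newer than plan.invoiced_through

    plan.invoicing_status = CustomerPlan.DONE
    plan.save(update_fields=["invoicing_status"])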
STANDARD = 1
PLUS = 2 # not available through self-serve signup
ENTERPRISE = 10
- tier = models.SmallIntegerField()
+ tier: int = models.SmallIntegerField()
ACTIVE = 1
DOWNGRADE_AT_END_OF_CYCLE = 2
@@ -259,7 +265,7 @@ class CustomerPlan(models.Model):
LIVE_STATUS_THRESHOLD = 10
ENDED = 11
NEVER_STARTED = 12
- status = models.SmallIntegerField(default=ACTIVE)
+ status: int = models.SmallIntegerField(default=ACTIVE)
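A hedged sketch of how the status lifecycle plays out at a cycle boundary: a plan flagged DOWNGRADE_AT_END_OF_CYCLE is ended rather than renewed (illustrative only; the real transition logic is not part of this diff):

from corporate.models import CustomerPlan


def apply_end_of_cycle_status(plan: CustomerPlan) -> None:
    # Pending downgrades take effect only at the billing-cycle boundary;
    # until then the plan still counts as live (status < LIVE_STATUS_THRESHOLD).
    if plan.status == CustomerPlan.DOWNGRADE_AT_END_OF_CYCLE:
        plan.status = CustomerPlan.ENDED
        plan.save(update_fields=["status"])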
# TODO maybe override setattr to ensure billing_cycle_anchor, etc
# are immutable.
@@ -323,37 +329,38 @@ class LicenseLedger(models.Model):
in case of issues.
"""
- plan = models.ForeignKey(CustomerPlan, on_delete=CASCADE)
+ plan: CustomerPlan = models.ForeignKey(CustomerPlan, on_delete=CASCADE)
# Also True for the initial upgrade.
- is_renewal = models.BooleanField(default=False)
+ is_renewal: bool = models.BooleanField(default=False)
- event_time = models.DateTimeField()
+ event_time: datetime.datetime = models.DateTimeField()
# The number of licenses ("seats") purchased by the organization at the time of ledger
# entry creation. Normally, to add a user the organization needs at least one spare license.
# Once a license is purchased, it is valid till the end of the billing period, irrespective
# of whether the license is used or not. So the value of licenses will never decrease for
# subsequent LicenseLedger entries in the same billing period.
- licenses = models.IntegerField()
+ licenses: int = models.IntegerField()
# The number of licenses the organization needs in the next billing cycle. The value of
# licenses_at_next_renewal can increase or decrease for subsequent LicenseLedger entries in
# the same billing period. For plans on automatic license management this value is usually
# equal to the number of activated users in the organization.
- licenses_at_next_renewal = models.IntegerField(null=True)
+ licenses_at_next_renewal: Optional[int] = models.IntegerField(null=True)
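Putting the two counters together: under automatic license management a new ledger entry could be produced roughly like this (hypothetical helper; it glosses over the reset that happens on renewal entries):

import datetime

from corporate.models import CustomerPlan, LicenseLedger


def record_seat_count_sketch(
    plan: CustomerPlan, event_time: datetime.datetime, seat_count: int
) -> LicenseLedger:
    last = LicenseLedger.objects.filter(plan=plan).order_by("id").last()
    # licenses never decreases within a billing period ...
    licenses = max(seat_count, last.licenses if last is not None else seat_count)
    return LicenseLedger.objects.create(
        plan=plan,
        event_time=event_time,
        licenses=licenses,
        # ... but the count used at the next renewal tracks the current
        # seat count and may go down as well as up.
        licenses_at_next_renewal=seat_count,
    )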
class ZulipSponsorshipRequest(models.Model):
- realm = models.ForeignKey(Realm, on_delete=CASCADE)
- requested_by = models.ForeignKey(UserProfile, on_delete=CASCADE)
+ id: int = models.AutoField(auto_created=True, primary_key=True, verbose_name="ID")
+ realm: Realm = models.ForeignKey(Realm, on_delete=CASCADE)
+ requested_by: UserProfile = models.ForeignKey(UserProfile, on_delete=CASCADE)
- org_type = models.PositiveSmallIntegerField(
+ org_type: int = models.PositiveSmallIntegerField(
default=Realm.ORG_TYPES["unspecified"]["id"],
choices=[(t["id"], t["name"]) for t in Realm.ORG_TYPES.values()],
)
MAX_ORG_URL_LENGTH: int = 200
- org_website = models.URLField(max_length=MAX_ORG_URL_LENGTH, blank=True, null=True)
+ org_website: str = models.URLField(max_length=MAX_ORG_URL_LENGTH, blank=True, null=True)
- org_description = models.TextField(default="")
+ org_description: str = models.TextField(default="")

Some files were not shown because too many files have changed in this diff.